Oct 02 18:17:57 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 18:17:57 crc restorecon[4728]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 18:17:57 crc restorecon[4728]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc 
restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc 
restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 
18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc 
restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc 
restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:57 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58
crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 
18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 18:17:58 crc restorecon[4728]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc 
restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 18:17:58 crc restorecon[4728]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 18:17:59 crc kubenswrapper[4909]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.319095 4909 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331369 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331414 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331424 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331435 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331444 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331454 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331462 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331471 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331480 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331488 4909 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331496 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331504 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331512 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331524 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331534 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331543 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331554 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331567 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331576 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331585 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331593 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331601 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331609 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331618 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331626 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331634 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331643 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331653 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331661 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331669 4909 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331677 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331685 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331693 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331703 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331711 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331719 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331726 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331735 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331744 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331755 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331766 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331775 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331790 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331802 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331812 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331822 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331831 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331840 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331847 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331855 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331863 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331871 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331880 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331887 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331895 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331905 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331914 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331921 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331929 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331937 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331945 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331953 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331963 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331971 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331981 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.331990 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.332000 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.332011 4909 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.332065 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.332080 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.332089 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332286 4909 flags.go:64] FLAG: --address="0.0.0.0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332316 4909 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332339 4909 flags.go:64] FLAG: --anonymous-auth="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332350 4909 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332363 4909 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332372 4909 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332384 4909 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332396 4909 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332405 4909 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332414 4909 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332424 4909 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332436 4909 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332445 4909 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332455 4909 flags.go:64] FLAG: --cgroup-root=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332464 4909 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332473 4909 flags.go:64] FLAG: --client-ca-file=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332482 4909 flags.go:64] FLAG: --cloud-config=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332490 4909 flags.go:64] FLAG: --cloud-provider=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332499 4909 flags.go:64] FLAG: --cluster-dns="[]"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332511 4909 flags.go:64] FLAG: --cluster-domain=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332520 4909 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332529 4909 flags.go:64] FLAG: --config-dir=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332537 4909 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332548 4909 flags.go:64] FLAG: --container-log-max-files="5"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332559 4909 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332568 4909 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332577 4909 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332587 4909 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332596 4909 flags.go:64] FLAG: --contention-profiling="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332605 4909 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332614 4909 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332623 4909 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332632 4909 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332645 4909 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332654 4909 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332663 4909 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332672 4909 flags.go:64] FLAG: --enable-load-reader="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332681 4909 flags.go:64] FLAG: --enable-server="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332690 4909 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332702 4909 flags.go:64] FLAG: --event-burst="100"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332713 4909 flags.go:64] FLAG: --event-qps="50"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332722 4909 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332731 4909 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332740 4909 flags.go:64] FLAG: --eviction-hard=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332751 4909 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332760 4909 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332769 4909 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332780 4909 flags.go:64] FLAG: --eviction-soft=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332789 4909 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332798 4909 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332807 4909 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332816 4909 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332825 4909 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332833 4909 flags.go:64] FLAG: --fail-swap-on="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332842 4909 flags.go:64] FLAG: --feature-gates=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332853 4909 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332863 4909 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332873 4909 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332882 4909 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332892 4909 flags.go:64] FLAG: --healthz-port="10248"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332902 4909 flags.go:64] FLAG: --help="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332911 4909 flags.go:64] FLAG: --hostname-override=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332920 4909 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332929 4909 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332938 4909 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332948 4909 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332956 4909 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332966 4909 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332974 4909 flags.go:64] FLAG: --image-service-endpoint=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332983 4909 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.332994 4909 flags.go:64] FLAG: --kube-api-burst="100"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333003 4909 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333013 4909 flags.go:64] FLAG: --kube-api-qps="50"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333021 4909 flags.go:64] FLAG: --kube-reserved=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333060 4909 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333069 4909 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333079 4909 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333088 4909 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333097 4909 flags.go:64] FLAG: --lock-file=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333106 4909 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333115 4909 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333124 4909 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333139 4909 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333149 4909 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333159 4909 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333168 4909 flags.go:64] FLAG: --logging-format="text"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333177 4909 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333187 4909 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333196 4909 flags.go:64] FLAG: --manifest-url=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333204 4909 flags.go:64] FLAG: --manifest-url-header=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333217 4909 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333227 4909 flags.go:64] FLAG: --max-open-files="1000000"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333238 4909 flags.go:64] FLAG: --max-pods="110"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333247 4909 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333256 4909 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333265 4909 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333276 4909 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333287 4909 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333296 4909 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333306 4909 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333328 4909 flags.go:64] FLAG: --node-status-max-images="50"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333338 4909 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333347 4909 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333356 4909 flags.go:64] FLAG: --pod-cidr=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333366 4909 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333380 4909 flags.go:64] FLAG: --pod-manifest-path=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333389 4909 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333398 4909 flags.go:64] FLAG: --pods-per-core="0"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333407 4909 flags.go:64] FLAG: --port="10250"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333417 4909 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333425 4909 flags.go:64] FLAG: --provider-id=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333435 4909 flags.go:64] FLAG: --qos-reserved=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333444 4909 flags.go:64] FLAG: --read-only-port="10255"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333452 4909 flags.go:64] FLAG: --register-node="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333461 4909 flags.go:64] FLAG: --register-schedulable="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333470 4909 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333485 4909 flags.go:64] FLAG: --registry-burst="10"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333494 4909 flags.go:64] FLAG: --registry-qps="5"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333503 4909 flags.go:64] FLAG: --reserved-cpus=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333527 4909 flags.go:64] FLAG: --reserved-memory=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333538 4909 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333547 4909 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333557 4909 flags.go:64] FLAG: --rotate-certificates="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333566 4909 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333575 4909 flags.go:64] FLAG: --runonce="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333584 4909 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333593 4909 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333603 4909 flags.go:64] FLAG: --seccomp-default="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333612 4909 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333620 4909 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333630 4909 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333640 4909 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333649 4909 flags.go:64] FLAG: --storage-driver-password="root"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333658 4909 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333667 4909 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333676 4909 flags.go:64] FLAG: --storage-driver-user="root"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333685 4909 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333695 4909 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333704 4909 flags.go:64] FLAG: --system-cgroups=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333713 4909 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333728 4909 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333737 4909 flags.go:64] FLAG: --tls-cert-file=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333746 4909 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333758 4909 flags.go:64] FLAG: --tls-min-version=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333767 4909 flags.go:64] FLAG: --tls-private-key-file=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333776 4909 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333784 4909 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333833 4909 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333843 4909 flags.go:64] FLAG: --v="2"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333856 4909 flags.go:64] FLAG: --version="false"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333867 4909 flags.go:64] FLAG: --vmodule=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333878 4909 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.333888 4909 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334189 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334201 4909 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334212 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334222 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334230 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334240 4909 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334251 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334261 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334271 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334280 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334289 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334297 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334306 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334315 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334324 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334332 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334341 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334348 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334356 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334364 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334372 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334380 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334388 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334395 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334403 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334410 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334418 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334428 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334437 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334454 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334462 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334470 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334478 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334486 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334494 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334502 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334510 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334519 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334528 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334536 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334545 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334552 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334563 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334572 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334581 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334589 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334597 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334605 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334613 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334621 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334634 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334642 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334650 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334658 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334666 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334673 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334681 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334689 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334702 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334714 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334726 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334738 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334749 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334759 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334770 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334780 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334790 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334800 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334811 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334821 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.334830 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.334842 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.347793 4909 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.347859 4909 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.347980 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348003 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348017 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348056 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348067 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348075 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348084 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348095 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348104 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348113 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348121 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348129 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348138 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348146 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348154 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348163 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348172 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348181 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348191 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348200 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348208 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348216 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348223 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348231 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348239 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348247 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348254 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348262 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348269 4909 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348277 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348285 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348293 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348300 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348308 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348320 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348328 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348336 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348344 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348351 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348360 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348367 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348375 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348382 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348390 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348397 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348405 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348413 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348420 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348428 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348438 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348448 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348457 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348468 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348478 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348487 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348496 4909 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348504 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348513 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348522 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348531 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348539 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348547 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348554 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348562 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348570 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348577 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348586 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348593 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348601 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348609 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348619 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.348632 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348884 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348900 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348909 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348918 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348926 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348936 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348945 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348953 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348961 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348969 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348978 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348986 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.348995 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349003 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349011 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349019 4909 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349053 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349061 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349069 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349080 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349089 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349100 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349109 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349117 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349126 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349134 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349144 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349153 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349161 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349169 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349178 4909 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349186 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349194 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349203 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349213 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349221 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349229 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349237 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349245 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349252 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349261 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349268 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349276 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349284 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349296 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349304 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349315 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349324 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349333 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349341 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349349 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349358 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349366 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349374 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349382 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349390 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349398 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349408 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349416 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349425 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349433 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349442 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349450 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349458 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349466 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349474 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349483 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349491 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349499 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349506 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.349516 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.349529 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.350834 4909 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.357672 4909 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.357896 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.361003 4909 server.go:997] "Starting client certificate rotation"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.361076 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.361266 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-12 13:02:54.491891979 +0000 UTC
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.361376 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 978h44m55.130520104s for next certificate rotation
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.400023 4909 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.407627 4909 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.428150 4909 log.go:25] "Validated CRI v1 runtime API"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.469179 4909 log.go:25] "Validated CRI v1 image API"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.471515 4909 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.477691 4909 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-18-13-38-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.477746 4909 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.506881 4909 manager.go:217] Machine: {Timestamp:2025-10-02 18:17:59.502593291 +0000 UTC m=+0.690089200 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7c418406-42b5-4f83-a45d-1cef2c7c1a53 BootID:de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:10:22:bb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:10:22:bb Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:19:66:3d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c3:4a:8d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fb:0e:a6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5d:04:c6 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b7:9d:b7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:a1:26:62:e9:73 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:7d:e5:65:5b:87 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.507399 4909 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.507619 4909 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.508901 4909 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.509257 4909 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.509307 4909 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.510798 4909 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.510828 4909 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.511469 4909 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.511500 4909 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.511836 4909 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.511962 4909 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.519674 4909 kubelet.go:418] "Attempting to sync node with API server"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.519707 4909 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.519730 4909 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.519750 4909 kubelet.go:324] "Adding apiserver pod source"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.519769 4909 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.527178 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.527192 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.527338 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.527365 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.527953 4909 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.529525 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.536012 4909 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537740 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537770 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537779 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537789 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537833 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537843 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537854 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537873 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537885 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537895 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537928 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.537941 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.539587 4909 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.540198 4909 server.go:1280] "Started kubelet" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.541890 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:17:59 crc systemd[1]: Started Kubernetes Kubelet. Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.542724 4909 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.542723 4909 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.543596 4909 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.545111 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.545682 4909 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.545839 4909 server.go:460] "Adding debug handlers to kubelet server" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.546130 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:41:27.171167138 +0000 UTC Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.546175 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1343h23m27.624995538s for next certificate rotation Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.546204 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 
02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.546308 4909 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.546345 4909 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.546397 4909 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.554151 4909 factory.go:55] Registering systemd factory Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.554318 4909 factory.go:221] Registration of the systemd container factory successfully Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.562993 4909 factory.go:153] Registering CRI-O factory Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.563055 4909 factory.go:221] Registration of the crio container factory successfully Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.563136 4909 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.563105 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.563159 4909 factory.go:103] Registering Raw factory Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.563137 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms" Oct 02 18:17:59 crc 
kubenswrapper[4909]: E1002 18:17:59.563181 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.563180 4909 manager.go:1196] Started watching for new ooms in manager Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.564731 4909 manager.go:319] Starting recovery of all containers Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.563010 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186abf6d0d7bcb42 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 18:17:59.54015725 +0000 UTC m=+0.727653119,LastTimestamp:2025-10-02 18:17:59.54015725 +0000 UTC m=+0.727653119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.567953 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568119 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568183 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568243 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568317 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568385 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568441 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568503 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568563 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568639 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568719 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568789 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568852 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568914 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.568978 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569055 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569144 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569200 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569257 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569343 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569412 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569478 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569536 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569596 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569660 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569714 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569780 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569874 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.569942 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.570002 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.570078 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.570151 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.575980 4909 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576068 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576119 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576140 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576159 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576175 4909 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576188 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576201 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576217 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576235 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576251 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576264 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576277 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576295 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576307 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576324 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576339 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576351 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576367 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576381 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576396 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576415 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576432 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576448 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" 
seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576463 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576477 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576493 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576507 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576519 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576532 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: 
I1002 18:17:59.576543 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576555 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576569 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576584 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576599 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576613 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576626 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576641 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576653 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576665 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576687 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576704 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576717 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576746 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576760 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576774 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576787 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576802 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576816 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576829 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576842 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576857 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576872 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576886 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576900 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576914 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576927 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576941 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576956 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576969 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576984 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.576996 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577007 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577019 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577104 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577117 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577132 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577147 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577162 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577175 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577186 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577198 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577211 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577230 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577246 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577262 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577274 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577287 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577302 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577319 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577335 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577350 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577363 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577374 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577385 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577398 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577411 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577423 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577435 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577448 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577466 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577480 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577492 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577505 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577517 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577534 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577549 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577564 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577577 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577600 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577612 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.577623 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578727 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578769 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578782 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578794 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578805 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578816 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578827 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578839 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578849 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578859 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578870 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578880 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578890 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578900 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578909 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578920 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578930 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578941 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578951 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578961 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578971 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578982 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.578994 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579007 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579016 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579067 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579080 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579090 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579100 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579109 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579119 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579130 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579141 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579151 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579161 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579171 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579182 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579191 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579200 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579210 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579220 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579230 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579240 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579250 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579258 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579269 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579280 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579290 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579300 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579311 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579320 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct
02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579329 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579339 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579349 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579359 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579369 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579379 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 
18:17:59.579389 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579402 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579411 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579420 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579431 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579441 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579450 4909 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579467 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579476 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579485 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579496 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579505 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579515 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579525 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579534 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579545 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579553 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579563 4909 reconstruct.go:97] "Volume reconstruction finished" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.579572 4909 reconciler.go:26] "Reconciler: start to sync state" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.586896 4909 manager.go:324] Recovery completed Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.596345 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 
18:17:59.598946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.598979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.598991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.601277 4909 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.601305 4909 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.601328 4909 state_mem.go:36] "Initialized new in-memory state store" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.605136 4909 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.606982 4909 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.607074 4909 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.607123 4909 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.607213 4909 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 18:17:59 crc kubenswrapper[4909]: W1002 18:17:59.609339 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.609429 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.619698 4909 policy_none.go:49] "None policy: Start" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.620368 4909 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.620388 4909 state_mem.go:35] "Initializing new in-memory state store" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.646532 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.672173 4909 manager.go:334] "Starting Device Plugin manager" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.672423 4909 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.672450 4909 server.go:79] "Starting device plugin registration server" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.672900 4909 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.672922 4909 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.673094 4909 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.673194 4909 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.673204 4909 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.680203 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.707714 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.707803 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709215 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709344 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709373 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.709894 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710190 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710291 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710586 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.710622 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711091 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711194 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711611 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711790 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.711853 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 
crc kubenswrapper[4909]: I1002 18:17:59.712873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.712884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.713052 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.713072 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.713668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.713720 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.713739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.763845 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.773753 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.775109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.775172 4909 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.775196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.775234 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.775756 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780905 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780926 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780958 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.780986 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781091 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781134 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781195 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.781325 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.882361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.882593 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.882898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.882851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883165 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883190 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883352 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883349 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883379 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883504 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883609 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.883831 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.976114 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.977591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.977647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.977665 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 18:17:59 crc kubenswrapper[4909]: I1002 18:17:59.977699 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:17:59 crc kubenswrapper[4909]: E1002 18:17:59.978365 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.037913 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.056687 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.064672 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.083385 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.087838 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.100625 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d508485dba6fbe7fd9f4d1b253d74d1265939601d5d03cf308f8e9ba8d3efde7 WatchSource:0}: Error finding container d508485dba6fbe7fd9f4d1b253d74d1265939601d5d03cf308f8e9ba8d3efde7: Status 404 returned error can't find the container with id d508485dba6fbe7fd9f4d1b253d74d1265939601d5d03cf308f8e9ba8d3efde7 Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.102975 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-126c9f331b901cc240c8ff84f184d1fc5079f37341daa1094226d0d422f1a3a4 WatchSource:0}: Error finding container 126c9f331b901cc240c8ff84f184d1fc5079f37341daa1094226d0d422f1a3a4: Status 404 returned error can't find the container with id 126c9f331b901cc240c8ff84f184d1fc5079f37341daa1094226d0d422f1a3a4 Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.123374 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8cb411464ca20d7dd411c611b09fbea0234e036f4dce0fb7ef5b4a47cbf1b626 WatchSource:0}: Error finding container 8cb411464ca20d7dd411c611b09fbea0234e036f4dce0fb7ef5b4a47cbf1b626: Status 404 returned error can't find the container with id 8cb411464ca20d7dd411c611b09fbea0234e036f4dce0fb7ef5b4a47cbf1b626 Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.125482 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-fe501c97f30b112a23bb8bc23df65909a572dea49d02beaa6ebca7abf58b4725 
WatchSource:0}: Error finding container fe501c97f30b112a23bb8bc23df65909a572dea49d02beaa6ebca7abf58b4725: Status 404 returned error can't find the container with id fe501c97f30b112a23bb8bc23df65909a572dea49d02beaa6ebca7abf58b4725 Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.165107 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms" Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.358294 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.358420 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.378965 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.380279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.380319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.380335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.380366 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.380785 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.542788 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.579697 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.579809 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.611326 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d508485dba6fbe7fd9f4d1b253d74d1265939601d5d03cf308f8e9ba8d3efde7"} Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.612617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe501c97f30b112a23bb8bc23df65909a572dea49d02beaa6ebca7abf58b4725"} Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.613707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cb411464ca20d7dd411c611b09fbea0234e036f4dce0fb7ef5b4a47cbf1b626"} Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.615436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30d5f74649f53b97a56921a3348a79226aa60650f819187a12208088011bb4a2"} Oct 02 18:18:00 crc kubenswrapper[4909]: I1002 18:18:00.617385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"126c9f331b901cc240c8ff84f184d1fc5079f37341daa1094226d0d422f1a3a4"} Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.624259 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.624350 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:00 crc kubenswrapper[4909]: W1002 18:18:00.782558 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.782663 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:00 crc kubenswrapper[4909]: E1002 18:18:00.966102 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.180887 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.183473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.183854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.183871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.183911 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:18:01 crc kubenswrapper[4909]: E1002 18:18:01.184634 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Oct 02 18:18:01 crc 
kubenswrapper[4909]: I1002 18:18:01.542974 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.623641 4909 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b" exitCode=0 Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.623764 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.623825 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.625593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.625649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.625667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.627224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.627292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.627322 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.629693 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="76976bc4dab27c2bc9e8cb12293877f800b24d6208dc2a1fc06cf1fc679d6692" exitCode=0 Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.629804 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"76976bc4dab27c2bc9e8cb12293877f800b24d6208dc2a1fc06cf1fc679d6692"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.630243 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.631870 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1" exitCode=0 Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.631980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.632292 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 
18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.632853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.632891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.632904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.633897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.633979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.634002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.638903 4909 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c" exitCode=0 Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.638966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c"} Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.639014 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.640393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 
18:18:01.640452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.640471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.641222 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.642499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.642544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:01 crc kubenswrapper[4909]: I1002 18:18:01.642561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.544060 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:02 crc kubenswrapper[4909]: E1002 18:18:02.567236 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="3.2s" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.644385 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e88865d8c3af16ebeebe7c058415570d789fea42749177b535120e7edc05a670" exitCode=0 Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.644465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e88865d8c3af16ebeebe7c058415570d789fea42749177b535120e7edc05a670"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.644593 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.646168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.646209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.646225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.649518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.649557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.649573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.649581 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.652919 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.652971 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.654145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.654219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.654245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.658382 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.658416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.658427 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.658415 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.659575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.659644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.659671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.661994 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec"} Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.662154 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.663794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.663832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.663845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.785170 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.786664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.786700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.786715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.786749 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:18:02 crc kubenswrapper[4909]: E1002 18:18:02.787331 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Oct 02 18:18:02 crc kubenswrapper[4909]: W1002 18:18:02.819758 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:02 crc kubenswrapper[4909]: E1002 18:18:02.819856 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:02 crc kubenswrapper[4909]: I1002 18:18:02.888546 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:03 crc kubenswrapper[4909]: W1002 18:18:03.254492 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Oct 02 18:18:03 crc kubenswrapper[4909]: E1002 18:18:03.254621 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.668348 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9d58f3537f48f50ff2f6359e7f7307b283fd3e69a1b3c9d6fd92e9beb93cd795" exitCode=0 Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.668512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9d58f3537f48f50ff2f6359e7f7307b283fd3e69a1b3c9d6fd92e9beb93cd795"} Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.668679 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.669977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.670017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.670067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.673365 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.684758 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845" exitCode=255 Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.684927 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.685619 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.686097 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845"} Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.686219 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.686674 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.686713 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.687648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.687684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.687697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.688410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.688442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.688455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689493 4909 scope.go:117] "RemoveContainer" containerID="9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:03 crc kubenswrapper[4909]: I1002 18:18:03.689840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.049045 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 
18:18:04.692631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3294328b2336d9690415ac34b8143c7c0eae10ea727139ee938954c91897c536"} Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.692712 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5efed0c035a70f9cafb36e0314b45dc2ef2c0b190405f2d29aebb90bbf0076e6"} Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.692765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eadf1786888de2315cad908fcbc7cc00876ae9daa9c886a2b17c39a804bd3683"} Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.695149 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.697390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5"} Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.697508 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.697563 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.697516 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699001 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:04 crc kubenswrapper[4909]: I1002 18:18:04.699368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.519767 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.529252 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.706988 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.706969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2a67a53d782ae1009e7016ff28977fd72aaa0d3083c4f0ba47e48b9bc90c199"} Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707187 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"829f91c01ec24d0b7cf680494ba4588058de95d0ed144da2f718b5534ea9394d"} Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707251 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707252 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.707915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.708816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.708864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.708885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.708860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.709006 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.709185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.987966 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.989599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.989660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.989684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:05 crc kubenswrapper[4909]: I1002 18:18:05.989725 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.709509 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.709636 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.709664 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.710502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.710531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.710542 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:06 crc kubenswrapper[4909]: I1002 18:18:06.711966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:07 crc kubenswrapper[4909]: I1002 18:18:07.717767 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:07 crc kubenswrapper[4909]: I1002 18:18:07.717964 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:07 crc kubenswrapper[4909]: I1002 18:18:07.719567 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:07 crc kubenswrapper[4909]: I1002 18:18:07.719615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:07 crc kubenswrapper[4909]: I1002 18:18:07.719628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:08 crc 
kubenswrapper[4909]: I1002 18:18:08.218295 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.218559 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.220220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.220273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.220289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.351485 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.351741 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.353307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.353518 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.353575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.924286 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.924547 4909 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.926677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.926771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:08 crc kubenswrapper[4909]: I1002 18:18:08.926798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:09 crc kubenswrapper[4909]: E1002 18:18:09.680299 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 18:18:10 crc kubenswrapper[4909]: I1002 18:18:10.719017 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 18:18:10 crc kubenswrapper[4909]: I1002 18:18:10.719225 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.399273 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.399607 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.401667 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.401733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.401756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.408202 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.725047 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.726356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.726430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:11 crc kubenswrapper[4909]: I1002 18:18:11.726455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:13 crc kubenswrapper[4909]: W1002 18:18:13.497854 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 02 18:18:13 crc kubenswrapper[4909]: I1002 18:18:13.497960 4909 trace.go:236] Trace[10386567]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 18:18:03.496) (total time: 10001ms): Oct 02 18:18:13 crc kubenswrapper[4909]: Trace[10386567]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:18:13.497) Oct 02 18:18:13 crc kubenswrapper[4909]: Trace[10386567]: [10.001216658s] [10.001216658s] END Oct 02 18:18:13 crc kubenswrapper[4909]: E1002 18:18:13.497986 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 02 18:18:13 crc kubenswrapper[4909]: I1002 18:18:13.543373 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 02 18:18:13 crc kubenswrapper[4909]: W1002 18:18:13.811515 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 02 18:18:13 crc kubenswrapper[4909]: I1002 18:18:13.811676 4909 trace.go:236] Trace[1607589878]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 18:18:03.809) (total time: 10001ms): Oct 02 18:18:13 crc kubenswrapper[4909]: Trace[1607589878]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:18:13.811) Oct 02 18:18:13 crc kubenswrapper[4909]: Trace[1607589878]: [10.001973261s] [10.001973261s] END Oct 02 18:18:13 crc kubenswrapper[4909]: E1002 18:18:13.811714 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.049988 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.050136 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.205340 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.205821 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.207685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.207750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.207770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.268115 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.635199 4909 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.635282 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.732501 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.733560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.733613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.733626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:14 crc kubenswrapper[4909]: I1002 18:18:14.760582 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 18:18:15 crc kubenswrapper[4909]: I1002 18:18:15.735160 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:15 crc kubenswrapper[4909]: I1002 18:18:15.736110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:15 crc kubenswrapper[4909]: I1002 18:18:15.736160 4909 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:15 crc kubenswrapper[4909]: I1002 18:18:15.736173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.098334 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.530992 4909 apiserver.go:52] "Watching apiserver" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.541713 4909 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.542113 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.542552 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.542676 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.542767 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:18 crc kubenswrapper[4909]: E1002 18:18:18.542871 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:18 crc kubenswrapper[4909]: E1002 18:18:18.543070 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.543127 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.543227 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.543705 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:18 crc kubenswrapper[4909]: E1002 18:18:18.543751 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.545700 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.545895 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.546306 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.546419 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.546590 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.547143 4909 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.548273 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.548404 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.549398 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.549510 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 
02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.580531 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.593234 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.613954 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.627843 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.641063 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.655631 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:18 crc kubenswrapper[4909]: I1002 18:18:18.667112 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.054236 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.060279 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.067958 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.070797 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.081732 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.090527 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.102988 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.119782 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.130877 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.148390 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.161423 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.177992 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.193847 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.208458 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.232132 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.243058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.426224 4909 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.621366 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.633473 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.634444 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.635875 4909 trace.go:236] Trace[1724289901]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 18:18:07.407) (total time: 12228ms): Oct 02 18:18:19 crc kubenswrapper[4909]: Trace[1724289901]: ---"Objects listed" error: 12228ms (18:18:19.635) Oct 02 18:18:19 crc kubenswrapper[4909]: Trace[1724289901]: [12.22840526s] [12.22840526s] END Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.635901 4909 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.643084 4909 trace.go:236] Trace[720013646]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 18:18:09.604) (total time: 10038ms): Oct 02 18:18:19 crc kubenswrapper[4909]: Trace[720013646]: ---"Objects listed" error: 10038ms (18:18:19.642) Oct 02 18:18:19 crc kubenswrapper[4909]: Trace[720013646]: [10.038921449s] [10.038921449s] END Oct 
02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.643112 4909 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.643213 4909 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.644132 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.661220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.679097 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.698290 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.710151 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.727566 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744363 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744472 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744492 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744550 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744617 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744639 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744658 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744703 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 
18:18:19.744724 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744917 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744919 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: 
"kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744942 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.744994 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745070 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745095 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745152 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745180 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745203 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745225 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745253 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745279 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745304 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745381 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745455 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745503 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745555 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745578 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745612 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745636 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745660 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745714 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: 
"09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745739 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745828 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745844 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745936 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.745980 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746021 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746173 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746234 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746259 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746282 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746308 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746330 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746377 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 
crc kubenswrapper[4909]: I1002 18:18:19.746431 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746475 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746499 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746523 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746554 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746608 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746635 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746667 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746709 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746790 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746838 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746860 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746882 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746923 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746971 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747059 
4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747099 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747132 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747195 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747229 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747295 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747326 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747704 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747810 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747846 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747889 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747930 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 18:18:19 
crc kubenswrapper[4909]: I1002 18:18:19.748001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748077 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748178 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748201 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748223 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748244 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748272 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748306 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748373 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 
18:18:19.748398 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748422 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748545 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748612 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748648 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748679 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748747 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748783 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748842 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748875 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748911 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748948 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748982 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749081 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749118 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749156 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749261 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749301 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749368 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749404 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749501 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749523 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749637 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc 
kubenswrapper[4909]: I1002 18:18:19.749672 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749706 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749741 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749791 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749847 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.749881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750090 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 
02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750218 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750371 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750443 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750476 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750510 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750619 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750645 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750669 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750702 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750733 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750759 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750845 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750935 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750961 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " 
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751057 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751101 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751128 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751176 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751201 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751226 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751276 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 
18:18:19.751300 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751324 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751405 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:19 crc kubenswrapper[4909]: 
I1002 18:18:19.751463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751547 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751602 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746109 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746117 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746362 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.746662 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.747198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.748396 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750228 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750459 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750721 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.750984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751313 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751370 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.751954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.752219 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.752498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.752544 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.752810 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.752859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.753446 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.754269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.756045 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.756255 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.757000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758178 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758462 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758658 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.758592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.760233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.760299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.760785 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.761041 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.761145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.761469 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762267 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762373 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762542 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762646 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762863 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.762891 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.763504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.763879 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.763971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764448 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.753074 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764785 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764838 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.764994 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765066 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765104 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765138 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc 
kubenswrapper[4909]: I1002 18:18:19.765162 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765182 4909 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765202 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765224 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765229 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765244 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765315 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765332 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765343 4909 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765358 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765370 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765380 4909 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765393 4909 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765403 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765412 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765422 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.765432 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.766240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.766315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.766493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.766512 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.766642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.766934 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767108 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767363 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767392 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767407 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767420 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767434 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767447 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767458 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767468 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767478 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767488 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767499 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767509 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767518 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767528 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767538 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767547 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767559 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767571 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767580 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767590 4909 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767599 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767608 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767617 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767628 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767638 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.767648 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.768597 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.768969 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769000 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769013 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769049 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769063 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769074 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769085 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769096 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769106 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769116 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769125 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769135 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769144 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.769161 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.769859 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:18:20.269836302 +0000 UTC m=+21.457332171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.771268 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.771307 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.771453 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.772530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.772814 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.772861 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:20.272846425 +0000 UTC m=+21.460342284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.772850 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.773782 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.773987 4909 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.777910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.778301 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.778356 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.778406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.778438 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.778458 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.778710 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.778753 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.778779 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.779152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.779393 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.779915 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.780389 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.780404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.780589 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.780753 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.780850 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.781331 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.781411 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.781503 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.781918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.788141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.788309 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.788383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.789361 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.790854 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.791180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.792385 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.793394 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.793797 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.793815 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.794257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795344 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795369 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795533 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795782 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795838 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.795932 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.796051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.796293 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.796750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.796919 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.797386 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.798011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.797942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.798747 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.798835 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.799079 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.799215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.799336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.799634 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.799979 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800241 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800369 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800527 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800640 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800703 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800719 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.800851 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.798857 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.798890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801342 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801419 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801457 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801505 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801596 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801636 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801884 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.801960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.805854 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.807020 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.807288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.807379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.807994 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808190 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808368 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808569 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808729 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808811 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808934 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808940 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.808955 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.809223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.809743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.810329 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.810521 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.810570 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.810962 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.811528 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.811746 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.811800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.812109 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.812809 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.812869 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.812976 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.813263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.813645 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.813685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.813944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.814315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.814372 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.814561 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.814830 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.814978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.815946 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.816083 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:20.316003304 +0000 UTC m=+21.503499183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.816086 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.817388 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.820169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.824541 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:20.324513886 +0000 UTC m=+21.512009765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.820587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.821318 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.817503 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.821468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.821682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.824689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.826903 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.827723 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.828223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.840324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.845324 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.847396 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.847630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.847697 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.847976 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.848277 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.848426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.848702 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.854919 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.854948 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.854961 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 18:18:19 crc kubenswrapper[4909]: E1002 18:18:19.855010 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:20.354993265 +0000 UTC m=+21.542489124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.856649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.872827 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876410 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876526 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876537 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876547 4909 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876555 4909 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876564 4909 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876573 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876582 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876591 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876600 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876609 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876617 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876626 4909 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876634 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876642 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876649 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876657 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876665 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876673 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876682 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876691 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876700 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876707 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876715 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876723 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876732 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876740 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876748 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876756 4909 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876764 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876772 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876781 4909 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876790 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876798 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876808 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876816 4909 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876825 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876833 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876842 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876850 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876858 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876866 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876875 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876883 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876891 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876899 4909 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876907 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876915 4909 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876923 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876932 4909 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876940 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876950 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876958 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876966 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876974 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876983 4909 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.876992 4909 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877000 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877009 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877017 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877038 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877046 4909 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877055 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877064 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877072 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877082 4909 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877090 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877099 4909 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877107 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877115 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877122 4909 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877130 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877138 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877146 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877155 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877164 4909 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877172 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877180 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877189 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877197 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877205 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877213 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" 
DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877221 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877229 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877237 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877245 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877254 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877262 4909 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877270 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 
18:18:19.877279 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877287 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877295 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877304 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877312 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877320 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877329 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877340 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877351 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877360 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877368 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877377 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877385 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877393 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877402 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877410 4909 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877418 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877427 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877434 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877442 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877450 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877458 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877466 4909 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877474 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877482 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877490 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877498 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877507 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877516 4909 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877524 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877531 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877539 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877547 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877556 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877564 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877571 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877579 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 
18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877586 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877594 4909 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877602 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877616 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877624 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877632 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877640 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877651 4909 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877695 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877705 4909 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.877759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.887270 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.900480 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.911305 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.920269 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:19 crc kubenswrapper[4909]: I1002 18:18:19.933134 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.058457 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.068720 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.075959 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 18:18:20 crc kubenswrapper[4909]: W1002 18:18:20.096752 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-96684af03e0dc4954ead21867fe8edae93b701c874448adc95921e81897b93b9 WatchSource:0}: Error finding container 96684af03e0dc4954ead21867fe8edae93b701c874448adc95921e81897b93b9: Status 404 returned error can't find the container with id 96684af03e0dc4954ead21867fe8edae93b701c874448adc95921e81897b93b9 Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.256515 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.263174 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.268747 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.274201 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.282509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.282586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.282647 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.282688 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:21.282675941 +0000 UTC m=+22.470171800 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.282727 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:18:21.282700862 +0000 UTC m=+22.470196731 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.290670 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.303981 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.324794 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.378867 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.382863 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.382891 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.382909 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383010 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383053 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383071 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383083 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383059 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383130 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:20 crc 
kubenswrapper[4909]: E1002 18:18:20.383038 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383117 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:21.383105905 +0000 UTC m=+22.570601764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383190 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:21.383182137 +0000 UTC m=+22.570677996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.383200 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:21.383195518 +0000 UTC m=+22.570691377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.409459 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.419008 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.427810 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.437148 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.452095 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.461873 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.471732 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.484796 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.495639 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.513325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.608435 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.608536 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.608609 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.608757 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.608852 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:20 crc kubenswrapper[4909]: E1002 18:18:20.608960 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.749988 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.750054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.750065 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"96684af03e0dc4954ead21867fe8edae93b701c874448adc95921e81897b93b9"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.751221 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"82884a5f90cd272a98f2b2b63c4a4ce38f094f5223e6909a3bf9322a082d9788"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.752393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.752421 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e91cb2bcd828fa315128a527f6414418d5b0938465b0f41a84af9db149f17fd2"} Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.764889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.775978 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.789145 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.797428 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.814320 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.824400 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.839321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.841375 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kjgfb"] Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.841904 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.843804 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.844992 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.845109 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.864227 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:20Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.887540 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-hosts-file\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.887591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxx72\" (UniqueName: \"kubernetes.io/projected/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-kube-api-access-lxx72\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.943990 4909 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c8
34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b
ece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:20Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.988056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-hosts-file\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.988104 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxx72\" (UniqueName: \"kubernetes.io/projected/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-kube-api-access-lxx72\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.988189 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-hosts-file\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:20 crc kubenswrapper[4909]: I1002 18:18:20.990075 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:20Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.007960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.008655 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxx72\" (UniqueName: \"kubernetes.io/projected/b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc-kube-api-access-lxx72\") pod \"node-resolver-kjgfb\" (UID: \"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\") " pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.021073 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.033617 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.044603 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.056325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.071596 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.083845 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.153417 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kjgfb" Oct 02 18:18:21 crc kubenswrapper[4909]: W1002 18:18:21.166844 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2951bc3_5e48_4af4_b5ac_7d7c74aad1fc.slice/crio-86519e0bd059672570c577d88b895150089bc9d0b6d080eca9f2d5c18fb60273 WatchSource:0}: Error finding container 86519e0bd059672570c577d88b895150089bc9d0b6d080eca9f2d5c18fb60273: Status 404 returned error can't find the container with id 86519e0bd059672570c577d88b895150089bc9d0b6d080eca9f2d5c18fb60273 Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.267788 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4777h"] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.268351 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.269725 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7gpnt"] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.270135 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4scf8"] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.270230 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.271838 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6lnlx"] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.272267 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.272381 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.272633 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.279817 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.281378 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285274 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285623 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285709 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285635 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285824 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.285931 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.286184 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.286565 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.286681 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.289731 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.289792 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.291820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.291953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc 
kubenswrapper[4909]: I1002 18:18:21.292001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.292133 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:18:23.292101588 +0000 UTC m=+24.479597487 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-k8s-cni-cncf-io\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292409 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-daemon-config\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " 
pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292587 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292674 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-socket-dir-parent\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292814 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-multus\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-bin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292879 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-etc-kubernetes\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292912 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-system-cni-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292948 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31958374-7b04-45be-9509-c51e08f9afe2-proxy-tls\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.292982 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-os-release\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbw7\" (UniqueName: \"kubernetes.io/projected/100e4154-9795-41ea-8365-38ab076e57cd-kube-api-access-xnbw7\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6xt7\" (UniqueName: \"kubernetes.io/projected/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-kube-api-access-f6xt7\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293123 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293157 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-kubelet\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293188 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.293390 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.293677 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:23.293457 +0000 UTC m=+24.480952939 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.293955 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-netns\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294350 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31958374-7b04-45be-9509-c51e08f9afe2-mcd-auth-proxy-config\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294503 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-multus-certs\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc 
kubenswrapper[4909]: I1002 18:18:21.294604 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-cnibin\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294784 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294804 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294840 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294881 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp72d\" (UniqueName: \"kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-os-release\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294963 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295093 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.294964 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsw7r\" (UniqueName: \"kubernetes.io/projected/31958374-7b04-45be-9509-c51e08f9afe2-kube-api-access-lsw7r\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295096 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295231 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-conf-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295342 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-system-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295389 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cnibin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295458 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cni-binary-copy\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295489 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-hostroot\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/31958374-7b04-45be-9509-c51e08f9afe2-rootfs\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.295555 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.300828 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.315407 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.328827 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.340435 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.353852 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.366270 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.379176 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.398841 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-etc-kubernetes\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-system-cni-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399639 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31958374-7b04-45be-9509-c51e08f9afe2-proxy-tls\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399664 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-os-release\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-bin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6xt7\" (UniqueName: \"kubernetes.io/projected/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-kube-api-access-f6xt7\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399746 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbw7\" (UniqueName: \"kubernetes.io/projected/100e4154-9795-41ea-8365-38ab076e57cd-kube-api-access-xnbw7\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399762 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399809 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399826 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399846 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-kubelet\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399862 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-netns\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399895 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31958374-7b04-45be-9509-c51e08f9afe2-mcd-auth-proxy-config\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399944 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399959 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399996 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-multus-certs\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400045 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400078 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400093 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-cnibin\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp72d\" (UniqueName: \"kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-os-release\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsw7r\" (UniqueName: \"kubernetes.io/projected/31958374-7b04-45be-9509-c51e08f9afe2-kube-api-access-lsw7r\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-system-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401691 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-conf-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401747 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cni-binary-copy\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-hostroot\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401771 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31958374-7b04-45be-9509-c51e08f9afe2-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/31958374-7b04-45be-9509-c51e08f9afe2-rootfs\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-multus-certs\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400318 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401049 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-cnibin\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-bin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.401306 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-conf-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401347 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn\") pod 
\"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401364 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401380 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-kubelet\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401432 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-netns\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-system-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: 
I1002 18:18:21.401779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/31958374-7b04-45be-9509-c51e08f9afe2-rootfs\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400877 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-os-release\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.400708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401243 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.401223 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-os-release\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402144 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.402019 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.402202 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-hostroot\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 
18:18:21.400803 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cnibin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.402258 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:23.402241762 +0000 UTC m=+24.589737641 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402312 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cnibin\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-k8s-cni-cncf-io\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.401308 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.402388 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:23.402365625 +0000 UTC m=+24.589861484 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-system-cni-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-cni-dir\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402422 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-run-k8s-cni-cncf-io\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-socket-dir-parent\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.399356 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-etc-kubernetes\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402390 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-daemon-config\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.401892 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402613 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-socket-dir-parent\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100e4154-9795-41ea-8365-38ab076e57cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402550 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-multus\") pod \"multus-7gpnt\" (UID: 
\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402873 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.402937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-host-var-lib-cni-multus\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.403060 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.403078 4909 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: E1002 18:18:21.403128 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:23.403108738 +0000 UTC m=+24.590604807 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403422 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-multus-daemon-config\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/100e4154-9795-41ea-8365-38ab076e57cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.403927 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc 
kubenswrapper[4909]: I1002 18:18:21.404440 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-cni-binary-copy\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.406802 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31958374-7b04-45be-9509-c51e08f9afe2-proxy-tls\") pod \"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.419051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6xt7\" (UniqueName: \"kubernetes.io/projected/c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e-kube-api-access-f6xt7\") pod \"multus-7gpnt\" (UID: \"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\") " pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.419079 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.420962 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbw7\" (UniqueName: \"kubernetes.io/projected/100e4154-9795-41ea-8365-38ab076e57cd-kube-api-access-xnbw7\") pod \"multus-additional-cni-plugins-6lnlx\" (UID: \"100e4154-9795-41ea-8365-38ab076e57cd\") " pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.425799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp72d\" (UniqueName: \"kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d\") pod \"ovnkube-node-4scf8\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.428199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsw7r\" (UniqueName: \"kubernetes.io/projected/31958374-7b04-45be-9509-c51e08f9afe2-kube-api-access-lsw7r\") pod 
\"machine-config-daemon-4777h\" (UID: \"31958374-7b04-45be-9509-c51e08f9afe2\") " pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.438926 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.469703 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.487837 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.499132 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.510515 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.522540 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.537943 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.555310 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.571321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.583919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.589134 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.599615 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7gpnt" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.601825 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: W1002 18:18:21.605925 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31958374_7b04_45be_9509_c51e08f9afe2.slice/crio-4c0af431b49bd774dc1f525cb03a4ee82c0f2ada4397f485b45380a1fcf94c26 WatchSource:0}: Error finding container 4c0af431b49bd774dc1f525cb03a4ee82c0f2ada4397f485b45380a1fcf94c26: Status 404 returned error can't find the container with id 4c0af431b49bd774dc1f525cb03a4ee82c0f2ada4397f485b45380a1fcf94c26 Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.610471 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.611833 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.612557 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.613298 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.613887 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.614474 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.614955 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.616578 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.617206 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.618322 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.618390 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.618984 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.620101 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.620792 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.620919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.622181 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.622758 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.624253 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.624812 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.625547 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.626349 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.627101 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.627686 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.628678 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.629232 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.630550 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.631268 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.634188 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.634950 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.635591 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.635578 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.636477 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.637056 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.637856 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.638406 4909 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.638503 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.640748 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.641505 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.642338 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.643898 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 18:18:21 
crc kubenswrapper[4909]: I1002 18:18:21.647257 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.650257 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.651000 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.652146 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.652701 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.653476 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.654543 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.655503 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 18:18:21 
crc kubenswrapper[4909]: I1002 18:18:21.655987 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.656954 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.657456 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.658582 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.659085 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.659934 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.660455 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.661176 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 18:18:21 
crc kubenswrapper[4909]: I1002 18:18:21.661729 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.662712 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.664441 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.775351 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"4c0af431b49bd774dc1f525cb03a4ee82c0f2ada4397f485b45380a1fcf94c26"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.777097 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.777144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"9e3044b5835521e010f1192f76b1a7d6b581f5f57b066d9292ba40c756595779"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.789225 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" 
event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerStarted","Data":"395140f60bb829ba8e4064cce8cc74ea235048cb1d07da9751600730f33bacef"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.793530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerStarted","Data":"61f03b0ad7db033711d5561c64523d9b95c77ad7aeffb0b48152c2cf316cc749"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.794973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjgfb" event={"ID":"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc","Type":"ContainerStarted","Data":"fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.795043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjgfb" event={"ID":"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc","Type":"ContainerStarted","Data":"86519e0bd059672570c577d88b895150089bc9d0b6d080eca9f2d5c18fb60273"} Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.797727 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.810726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.821246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.837532 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.854928 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 
cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.873144 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.889900 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.900892 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.918790 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.930670 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.949660 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.970686 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:21 crc kubenswrapper[4909]: I1002 18:18:21.987447 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:21Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.008451 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.027480 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.055622 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.103903 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.127916 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.157018 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.170685 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.183074 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.204885 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.227855 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.246888 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 
18:18:22.269445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.282503 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.607914 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.608001 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:22 crc kubenswrapper[4909]: E1002 18:18:22.608078 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:22 crc kubenswrapper[4909]: E1002 18:18:22.608197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.608665 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:22 crc kubenswrapper[4909]: E1002 18:18:22.609094 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.800397 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081" exitCode=0 Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.800477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.802481 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.802538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.805266 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b" exitCode=0 Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.805378 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 
18:18:22.807145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerStarted","Data":"b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.808721 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575"} Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.835881 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.850910 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.872504 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.892865 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.917479 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 
cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.935393 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.948979 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.966503 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.982283 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:22 crc kubenswrapper[4909]: I1002 18:18:22.995782 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 
18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.013008 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.026086 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.037911 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.053963 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.068095 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.086985 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.102098 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.112015 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.130290 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.142939 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86ead
b4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02
T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.160703 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.172949 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.183661 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.196972 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.225466 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.264208 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.320312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.320659 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:18:27.320629902 +0000 UTC m=+28.508125761 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.321123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.321279 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.321385 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:27.321370953 +0000 UTC m=+28.508866812 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.422258 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.422561 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.422719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.422562 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423079 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:27.423054086 +0000 UTC m=+28.610549965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.422628 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423307 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423413 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423532 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:27.423518091 +0000 UTC m=+28.611013960 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.422917 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423729 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423825 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:23 crc kubenswrapper[4909]: E1002 18:18:23.423944 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:27.423930473 +0000 UTC m=+28.611426352 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.819098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97"} Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.819764 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd"} Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.819922 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9"} Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.826497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerStarted","Data":"d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806"} Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.844344 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.862446 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.872879 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.888618 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.910335 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.929125 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.948961 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.964741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:23 crc kubenswrapper[4909]: I1002 18:18:23.986536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.001527 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.025810 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.042400 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.055174 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.607756 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.607905 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.607905 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:24 crc kubenswrapper[4909]: E1002 18:18:24.607938 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:24 crc kubenswrapper[4909]: E1002 18:18:24.608190 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:24 crc kubenswrapper[4909]: E1002 18:18:24.608436 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.831938 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806" exitCode=0 Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.832049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806"} Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.836902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05"} Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.836945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543"} Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.836959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982"} Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.851017 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.873975 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.888658 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.901981 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.916213 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.934303 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.954400 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.967398 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.981911 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:24 crc kubenswrapper[4909]: I1002 18:18:24.995314 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:24Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.011232 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.028898 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.042312 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.844167 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8" exitCode=0 Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.844281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8"} Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.874432 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.894496 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.918361 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.937456 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.952588 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.968516 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:25 crc kubenswrapper[4909]: I1002 18:18:25.988394 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 
1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:25Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.007125 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.022883 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.040290 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.044616 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.047559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.047828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc 
kubenswrapper[4909]: I1002 18:18:26.048051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.049429 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.058621 4909 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.059063 4909 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062629 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062333 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.062687 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.077885 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.082462 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.086399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.086458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc 
kubenswrapper[4909]: I1002 18:18:26.086475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.086503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.086518 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.100659 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fff
c20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.107366 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef
2c7c1a53\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.112514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.112558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.112571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.112590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.112601 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.124912 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.129450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.129492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.129502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.129520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.129531 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.143689 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.149234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.149288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.149303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.149324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.149340 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.161801 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.161919 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.163575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.163613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.163648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.163666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.163677 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.266821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.266863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.266871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.266885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.266894 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.369211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.369271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.369287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.369312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.369333 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.471272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.471315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.471327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.471344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.471358 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.575166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.575259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.575285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.575321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.575343 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.607586 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.607687 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.607732 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.607895 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.608105 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:26 crc kubenswrapper[4909]: E1002 18:18:26.608115 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.682923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.682990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.683016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.683095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.683124 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.786626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.786708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.786735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.786768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.786792 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.851437 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20" exitCode=0 Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.851493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.856385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.870869 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.890945 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.891197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.891343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.891368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.891399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.891423 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.901414 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.910426 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.937737 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.953094 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 
18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.976268 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.994923 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:26Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.995682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.995714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.995724 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.995740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:26 crc kubenswrapper[4909]: I1002 18:18:26.995753 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:26Z","lastTransitionTime":"2025-10-02T18:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.010841 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.024828 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.042200 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.062384 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.082961 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.097704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.097744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.097756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.097773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.097785 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.200781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.200836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.200853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.200876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.200893 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.304759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.304825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.304849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.304874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.304905 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.382801 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.382991 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 18:18:35.382962879 +0000 UTC m=+36.570458738 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.383113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.383283 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.383340 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:35.383331471 +0000 UTC m=+36.570827330 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.407803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.407852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.407866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.407888 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.407902 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484107 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484152 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484174 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484248 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:35.484224999 +0000 UTC m=+36.671720898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.483893 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.484415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.484473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484642 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:27 crc 
kubenswrapper[4909]: E1002 18:18:27.484667 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484687 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484724 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484751 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:35.484729094 +0000 UTC m=+36.672224993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:27 crc kubenswrapper[4909]: E1002 18:18:27.484782 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-02 18:18:35.484766096 +0000 UTC m=+36.672262005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.510902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.510964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.510977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.510997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.511009 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.614819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.614868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.614880 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.614895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.614907 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.717987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.718050 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.718061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.718078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.718089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.821726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.821796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.821820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.821851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.821875 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.864114 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerStarted","Data":"3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.883609 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c
42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.900582 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.919403 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.928600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.928648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.928663 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.928692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.928708 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:27Z","lastTransitionTime":"2025-10-02T18:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.941456 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb95
1629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.970983 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:27 crc kubenswrapper[4909]: I1002 18:18:27.989175 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:27Z is after 2025-08-24T17:21:41Z" Oct 02 
18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.008360 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnb
w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.023298 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.032723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.032762 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.032772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.032790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.032802 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.037650 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.052577 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.066206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.078778 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.091942 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.135087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.135150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.135168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.135195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.135217 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.238903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.238986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.239011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.239071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.239097 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.341765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.341846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.341871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.341905 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.341931 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.445446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.445551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.445566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.445585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.445598 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.549087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.549197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.549217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.549250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.549276 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.607582 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.607644 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.607581 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:28 crc kubenswrapper[4909]: E1002 18:18:28.607754 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:28 crc kubenswrapper[4909]: E1002 18:18:28.607848 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:28 crc kubenswrapper[4909]: E1002 18:18:28.608057 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.652931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.652980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.652996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.653016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.653053 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.756246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.756295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.756310 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.756329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.756347 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.858819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.858904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.858930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.858965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.858997 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.874096 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.874445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.880044 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a" exitCode=0 Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.880082 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.895337 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.910421 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:28 crc 
kubenswrapper[4909]: I1002 18:18:28.912289 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.941690 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.960267 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9j4dc"] Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.960211 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.960638 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.963603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.963637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.963646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.963658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.963682 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:28Z","lastTransitionTime":"2025-10-02T18:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.969612 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.970650 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.970743 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.976305 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.977337 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa
38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:28 crc kubenswrapper[4909]: I1002 18:18:28.995066 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.002967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpsc8\" (UniqueName: \"kubernetes.io/projected/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-kube-api-access-vpsc8\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.003017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-host\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.003063 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-serviceca\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.011468 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnb
w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.024909 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.039507 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.057506 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.065787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc 
kubenswrapper[4909]: I1002 18:18:29.065850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.065867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.065894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.065912 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.078082 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.092880 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.103884 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.104121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpsc8\" (UniqueName: \"kubernetes.io/projected/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-kube-api-access-vpsc8\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.104213 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-host\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.104255 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-serviceca\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.104472 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-host\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.106781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-serviceca\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.124260 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpsc8\" (UniqueName: \"kubernetes.io/projected/c51cad9b-1ae8-4c5d-bfd8-dfc178843056-kube-api-access-vpsc8\") pod \"node-ca-9j4dc\" (UID: \"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\") " pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.124672 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.140929 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.156730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.168769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.168832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.168850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 
18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.168876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.168895 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.173351 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.189995 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.203599 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.222527 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.237175 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.257652 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.271765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.271811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.271825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.271846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.271862 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.275002 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.287712 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9j4dc" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.297445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.317776 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.336464 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.359272 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.374385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.374435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.374450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.374468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.374481 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.478010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.478104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.478117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.478136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.478149 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.582128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.582201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.582219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.582247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.582267 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.628187 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.659968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.681733 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.696560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.696595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.696606 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.696643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.696656 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.705826 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb95
1629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.738207 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.754332 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:2
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.767964 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.778356 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.791475 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.799863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.799901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.799923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.799939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.799948 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.804143 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.815106 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.825937 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.835482 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.847275 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.886787 4909 generic.go:334] "Generic (PLEG): container finished" podID="100e4154-9795-41ea-8365-38ab076e57cd" containerID="1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085" exitCode=0 Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.886876 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerDied","Data":"1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.888048 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j4dc" event={"ID":"c51cad9b-1ae8-4c5d-bfd8-dfc178843056","Type":"ContainerStarted","Data":"abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.888072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j4dc" event={"ID":"c51cad9b-1ae8-4c5d-bfd8-dfc178843056","Type":"ContainerStarted","Data":"c51a7c4c8c4f7e1737a3b1285ea7e7776326fa37f1a5f75ec8ee86b291a0cebf"} Oct 02 18:18:29 crc kubenswrapper[4909]: 
I1002 18:18:29.888160 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.888628 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.902375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.902419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.902435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.902463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.902480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:29Z","lastTransitionTime":"2025-10-02T18:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.910766 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.919915 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.930232 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.946344 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.963719 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.975953 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86ead
b4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02
T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:29 crc kubenswrapper[4909]: I1002 18:18:29.996523 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.005003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.005056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.005068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.005089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.005102 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.016546 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.036602 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.054999 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.114932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.114961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.114970 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.114983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.114992 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.126437 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.154977 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.168931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.193788 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.210665 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.218308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.218731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.218760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc 
kubenswrapper[4909]: I1002 18:18:30.218790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.218813 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.224870 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.235725 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.244882 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.259588 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.275258 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.287576 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.299910 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.311613 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.321153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.321201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.321212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.321228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.321241 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.322076 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.350552 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.369727 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e4806
9d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.378584 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.388544 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02
T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.402196 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.423937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.423972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.423980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.423993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.424002 4909 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.527243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.527304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.527319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.527339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.527353 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.608119 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.608165 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.608226 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:30 crc kubenswrapper[4909]: E1002 18:18:30.608324 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:30 crc kubenswrapper[4909]: E1002 18:18:30.608525 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:30 crc kubenswrapper[4909]: E1002 18:18:30.608709 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.629976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.630019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.630076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.630102 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.630122 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.733424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.733492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.733511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.733537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.733555 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.836569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.836628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.836649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.836680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.836699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.896534 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" event={"ID":"100e4154-9795-41ea-8365-38ab076e57cd","Type":"ContainerStarted","Data":"83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.896626 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.911218 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb
413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\
"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.929162 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.939403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.939453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.939470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.939493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.939512 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:30Z","lastTransitionTime":"2025-10-02T18:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.948296 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.970655 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:30 crc kubenswrapper[4909]: I1002 18:18:30.987977 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:30Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.008181 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.041599 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.042884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.043114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.043266 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.043404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.043532 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.064536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.085115 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.103680 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.125386 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.141350 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.171810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.172143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.172393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.172681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.172925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.174241 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.199462 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:31Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.276838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.276887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.276903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.276924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.276939 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.379762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.379803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.379816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.379835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.379851 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.482935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.482998 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.483019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.483090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.483115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.585358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.585419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.585443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.585469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.585492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.688622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.689330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.689530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.689778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.689889 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.793139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.793202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.793220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.793245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.793263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.896930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.896984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.897000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.897023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.897082 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:31Z","lastTransitionTime":"2025-10-02T18:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:31 crc kubenswrapper[4909]: I1002 18:18:31.900086 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.000110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.000166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.000183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.000203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.000217 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.103304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.103346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.103357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.103374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.103388 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.205934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.206004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.206060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.206091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.206115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.309480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.309523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.309536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.309553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.309566 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.412696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.412954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.413065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.413169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.413279 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.516600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.516669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.516693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.516723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.516746 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.607648 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.607698 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:32 crc kubenswrapper[4909]: E1002 18:18:32.607852 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.607677 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:32 crc kubenswrapper[4909]: E1002 18:18:32.608316 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:32 crc kubenswrapper[4909]: E1002 18:18:32.607981 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.620132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.620192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.620208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.620233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.620251 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.723937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.724017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.724072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.724107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.724129 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.827646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.827699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.827716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.827740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.827758 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.909630 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/0.log" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.915600 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5" exitCode=1 Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.915628 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.917098 4909 scope.go:117] "RemoveContainer" containerID="71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.930231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.930281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.930300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.930324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.930341 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:32Z","lastTransitionTime":"2025-10-02T18:18:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.939484 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.961586 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:32 crc kubenswrapper[4909]: I1002 18:18:32.985592 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.004763 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.026994 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.033225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.033263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.033276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.033291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.033306 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.049761 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.075947 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.083551 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.095287 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.127212 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:32.074303 6174 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 18:18:32.074398 6174 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI1002 18:18:32.074423 6174 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 18:18:32.074471 6174 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 18:18:32.074484 6174 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 18:18:32.074534 6174 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 18:18:32.074564 6174 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:32.074590 6174 factory.go:656] Stopping watch factory\\\\nI1002 18:18:32.074616 6174 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:32.074676 6174 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 18:18:32.074696 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 18:18:32.074708 6174 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 18:18:32.074720 6174 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 18:18:32.074732 6174 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 18:18:32.074743 6174 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7
b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.136270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.136333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.136349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.136374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.136393 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.145070 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.164148 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.182990 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.200836 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.214792 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.239412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.239465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.239481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc 
kubenswrapper[4909]: I1002 18:18:33.239502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.239520 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.346104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.346161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.346177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.346202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.346219 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.449771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.450137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.450347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.450488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.450675 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.554442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.554516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.554540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.554573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.554596 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.656699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.656747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.656759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.656777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.656809 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.759765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.759826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.759847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.759870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.759888 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.866602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.866661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.866680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.866703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.866721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.922136 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/0.log" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.925394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.925879 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.946730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.968960 4909 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.969115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.969152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.969167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.969186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.969203 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:33Z","lastTransitionTime":"2025-10-02T18:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.986816 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:33 crc kubenswrapper[4909]: I1002 18:18:33.999847 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.011926 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.024983 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.041947 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:32.074303 6174 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 18:18:32.074398 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 18:18:32.074423 6174 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 
18:18:32.074471 6174 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 18:18:32.074484 6174 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 18:18:32.074534 6174 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 18:18:32.074564 6174 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:32.074590 6174 factory.go:656] Stopping watch factory\\\\nI1002 18:18:32.074616 6174 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:32.074676 6174 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 18:18:32.074696 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 18:18:32.074708 6174 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 18:18:32.074720 6174 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 18:18:32.074732 6174 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 18:18:32.074743 6174 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.055249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.067197 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.071305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.071347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.071359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.071379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.071391 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.079374 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.096368 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.114669 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.130779 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.143959 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.173277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.173317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.173327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.173340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.173350 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.276229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.276276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.276288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.276306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.276318 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.379249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.379318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.379329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.379352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.379366 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.482278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.482333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.482348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.482370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.482385 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.585687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.585743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.585761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.585783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.585798 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.608427 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.608528 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.608452 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:34 crc kubenswrapper[4909]: E1002 18:18:34.608855 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:34 crc kubenswrapper[4909]: E1002 18:18:34.609011 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:34 crc kubenswrapper[4909]: E1002 18:18:34.609172 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.674775 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl"] Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.675503 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.678404 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.678662 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.689940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.689979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.689996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.690052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.690080 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.696725 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2
c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.711818 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.733347 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.759636 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.776257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/115da42c-b1e9-470d-8734-a3331cdff421-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.776320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.776354 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m25r4\" (UniqueName: \"kubernetes.io/projected/115da42c-b1e9-470d-8734-a3331cdff421-kube-api-access-m25r4\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.776391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.780195 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c2
0c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.794599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.794651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.794663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.794697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.794717 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.797725 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.817088 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.832434 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:
03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.845221 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.857618 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.873551 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.877097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/115da42c-b1e9-470d-8734-a3331cdff421-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.877235 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.877284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25r4\" (UniqueName: \"kubernetes.io/projected/115da42c-b1e9-470d-8734-a3331cdff421-kube-api-access-m25r4\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.877349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.878403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.878591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/115da42c-b1e9-470d-8734-a3331cdff421-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.887596 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/115da42c-b1e9-470d-8734-a3331cdff421-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.894404 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:32.074303 6174 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 18:18:32.074398 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 18:18:32.074423 6174 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 
18:18:32.074471 6174 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 18:18:32.074484 6174 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 18:18:32.074534 6174 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 18:18:32.074564 6174 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:32.074590 6174 factory.go:656] Stopping watch factory\\\\nI1002 18:18:32.074616 6174 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:32.074676 6174 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 18:18:32.074696 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 18:18:32.074708 6174 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 18:18:32.074720 6174 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 18:18:32.074732 6174 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 18:18:32.074743 6174 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.897602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.897638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.897649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.897666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.897682 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:34Z","lastTransitionTime":"2025-10-02T18:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.900778 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25r4\" (UniqueName: \"kubernetes.io/projected/115da42c-b1e9-470d-8734-a3331cdff421-kube-api-access-m25r4\") pod \"ovnkube-control-plane-749d76644c-7tlwl\" (UID: \"115da42c-b1e9-470d-8734-a3331cdff421\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.914889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.930702 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/1.log" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.932174 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/0.log" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.935683 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460" exitCode=1 Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.935823 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" 
event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460"} Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.936199 4909 scope.go:117] "RemoveContainer" containerID="71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.936969 4909 scope.go:117] "RemoveContainer" containerID="86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460" Oct 02 18:18:34 crc kubenswrapper[4909]: E1002 18:18:34.937173 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.939636 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.955571 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.979475 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.996904 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" Oct 02 18:18:34 crc kubenswrapper[4909]: I1002 18:18:34.998126 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:34Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.000392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.000464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.000481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.000504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.000522 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.015968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: W1002 18:18:35.016375 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115da42c_b1e9_470d_8734_a3331cdff421.slice/crio-e004c90f3e32f7bb53e6ae93fb537e2f381d2427228d3b9e3ab19aea3073ecb0 WatchSource:0}: Error finding container e004c90f3e32f7bb53e6ae93fb537e2f381d2427228d3b9e3ab19aea3073ecb0: Status 404 returned error can't find the container with id e004c90f3e32f7bb53e6ae93fb537e2f381d2427228d3b9e3ab19aea3073ecb0 Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.034344 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.053187 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.070141 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.083801 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.103704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.104161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.104328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 
18:18:35.104058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5
fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.104464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.104686 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.122187 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.136011 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.148265 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.177456 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:32.074303 6174 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 18:18:32.074398 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 18:18:32.074423 6174 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 
18:18:32.074471 6174 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 18:18:32.074484 6174 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 18:18:32.074534 6174 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 18:18:32.074564 6174 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:32.074590 6174 factory.go:656] Stopping watch factory\\\\nI1002 18:18:32.074616 6174 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:32.074676 6174 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 18:18:32.074696 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 18:18:32.074708 6174 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 18:18:32.074720 6174 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 18:18:32.074732 6174 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 18:18:32.074743 6174 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.200308 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.207900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.207932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.207943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.207959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.207973 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.216171 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.231937 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.310983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.311069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.311090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.311117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.311134 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.386180 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.386439 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 18:18:51.386407951 +0000 UTC m=+52.573903850 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.386567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.386784 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.386925 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:51.386893796 +0000 UTC m=+52.574389695 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.414088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.414157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.414174 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.414200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.414218 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.418260 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wd57x"] Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.418956 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.419088 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.437927 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.460114 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.479916 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.488008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.488095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.488127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: 
\"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.488157 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9n9d\" (UniqueName: \"kubernetes.io/projected/ccda21fa-5211-460d-b521-fc5c86673b73-kube-api-access-k9n9d\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.488183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488276 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488300 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488301 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488326 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488368 4909 projected.go:194] Error preparing 
data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488409 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:51.488382133 +0000 UTC m=+52.675878032 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488460 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:51.488426124 +0000 UTC m=+52.675922033 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488328 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488507 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.488568 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:51.488550147 +0000 UTC m=+52.676046156 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.506450 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.517489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.517554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.517598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.517625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.517643 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.561738 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.588708 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.588749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9n9d\" (UniqueName: \"kubernetes.io/projected/ccda21fa-5211-460d-b521-fc5c86673b73-kube-api-access-k9n9d\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.588915 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.588996 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. 
No retries permitted until 2025-10-02 18:18:36.088978981 +0000 UTC m=+37.276474840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.590442 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.604561 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.606586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9n9d\" (UniqueName: \"kubernetes.io/projected/ccda21fa-5211-460d-b521-fc5c86673b73-kube-api-access-k9n9d\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:35 crc 
kubenswrapper[4909]: I1002 18:18:35.616411 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.619642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.619667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.619675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.619688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.619699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.634280 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71c56a2dc1f547a6e9c81eb9fe7460954b114ddba384e42a6437a43295f1c7c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:32.074303 6174 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 18:18:32.074398 6174 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 18:18:32.074423 6174 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 18:18:32.074471 6174 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 18:18:32.074484 6174 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 18:18:32.074534 6174 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 18:18:32.074564 6174 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:32.074590 6174 factory.go:656] Stopping watch factory\\\\nI1002 18:18:32.074616 6174 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:32.074676 6174 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 18:18:32.074696 6174 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 18:18:32.074708 6174 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 18:18:32.074720 6174 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 18:18:32.074732 6174 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 18:18:32.074743 6174 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7
b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.648150 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.663669 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.674245 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.685209 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.697678 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.709606 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.721800 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.722619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.722678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.722698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.722726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.722745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.826205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.826265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.826282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.826308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.826325 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.930091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.930152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.930170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.930195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.930224 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:35Z","lastTransitionTime":"2025-10-02T18:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.942174 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/1.log" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.949871 4909 scope.go:117] "RemoveContainer" containerID="86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460" Oct 02 18:18:35 crc kubenswrapper[4909]: E1002 18:18:35.950146 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.952198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" event={"ID":"115da42c-b1e9-470d-8734-a3331cdff421","Type":"ContainerStarted","Data":"090ac2b83ed0b5dbd5d9390b676b6a6fc525e232bad5e44c0e16169f68cd50ea"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.952251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" event={"ID":"115da42c-b1e9-470d-8734-a3331cdff421","Type":"ContainerStarted","Data":"a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.952274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" event={"ID":"115da42c-b1e9-470d-8734-a3331cdff421","Type":"ContainerStarted","Data":"e004c90f3e32f7bb53e6ae93fb537e2f381d2427228d3b9e3ab19aea3073ecb0"} Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.968205 4909 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:35 crc kubenswrapper[4909]: I1002 18:18:35.988682 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:35Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc 
kubenswrapper[4909]: I1002 18:18:36.010317 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.032960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.033683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.033783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.033806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.033832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.033851 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.052267 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.073946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.093099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.093288 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.093383 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:37.09335687 +0000 UTC m=+38.280852769 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.096069 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.112079 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.134366 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.137345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.137405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.137418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.137438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.137455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.152380 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.171422 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.178002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.178105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.178126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.178151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.178204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.194818 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.198860 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.204360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.204449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.204474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.204502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.204524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.214372 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.225236 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.229880 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.229965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.229985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.230010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.230064 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.233005 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.252135 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.256637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.256753 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.256772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.256795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.256813 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.265176 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.277747 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.281971 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.282012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.282041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.282061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.282076 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.288497 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.303608 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.304568 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306201 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306583 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.306630 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.321176 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.340252 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.364133 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.381569 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.404373 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.409087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.409147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.409165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.409191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.409209 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.420703 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.440445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.460380 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.488786 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.512569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.512616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.512631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.512652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.512668 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.514402 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
44e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.530597 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.546864 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.565856 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.584736 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.596586 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.607967 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.607995 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.608005 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.608136 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.608164 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.608303 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.608438 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:36 crc kubenswrapper[4909]: E1002 18:18:36.608522 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.614734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.614773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.614791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.614810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.614823 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.719214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.719274 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.719291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.719313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.719333 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.821869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.822382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.822410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.822440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.822463 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.925848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.925918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.925944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.925975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:36 crc kubenswrapper[4909]: I1002 18:18:36.925992 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:36Z","lastTransitionTime":"2025-10-02T18:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.029794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.029874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.029892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.029917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.029941 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.103877 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:37 crc kubenswrapper[4909]: E1002 18:18:37.104110 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:37 crc kubenswrapper[4909]: E1002 18:18:37.104221 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:39.10419488 +0000 UTC m=+40.291690769 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.133781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.133850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.133869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.133902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.133919 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.237765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.237836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.237853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.237879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.237900 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.340862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.340930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.340950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.340983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.341021 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.444176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.444240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.444257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.444282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.444299 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.547784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.547893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.547919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.547953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.547976 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.651921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.652069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.652090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.652114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.652158 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.754815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.754887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.754909 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.754934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.754951 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.858380 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.858453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.858475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.858506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.858527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.961456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.961604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.961635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.961655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:37 crc kubenswrapper[4909]: I1002 18:18:37.961668 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:37Z","lastTransitionTime":"2025-10-02T18:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.064769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.064848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.064868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.064911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.064936 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.168198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.168245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.168260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.168281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.168296 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.272624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.272693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.272715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.272747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.272773 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.375482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.375533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.375549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.375569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.375584 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.478626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.478670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.478687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.478706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.478724 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.581994 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.582099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.582125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.582157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.582179 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.607438 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.607438 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.607438 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.607908 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:38 crc kubenswrapper[4909]: E1002 18:18:38.608141 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:38 crc kubenswrapper[4909]: E1002 18:18:38.608314 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:38 crc kubenswrapper[4909]: E1002 18:18:38.608446 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:38 crc kubenswrapper[4909]: E1002 18:18:38.608532 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.684957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.685068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.685091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.685120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.685143 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.788456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.788813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.788996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.789298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.789354 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.892109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.892173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.892191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.892219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.892238 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.994465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.994577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.994590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.994606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:38 crc kubenswrapper[4909]: I1002 18:18:38.994619 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:38Z","lastTransitionTime":"2025-10-02T18:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.097239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.097295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.097309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.097334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.097348 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.127445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:39 crc kubenswrapper[4909]: E1002 18:18:39.127681 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:39 crc kubenswrapper[4909]: E1002 18:18:39.127822 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:43.127780801 +0000 UTC m=+44.315276710 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.199903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.199970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.199988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.200012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.200068 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.303355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.303427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.303444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.303466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.303483 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.406409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.406460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.406483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.406510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.406532 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.509750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.509786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.509796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.509811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.509823 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.613205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.613260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.613289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.613312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.613329 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.628838 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2
c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.651572 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.669606 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.689460 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.707082 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.721536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.722224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.722374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.722398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.722454 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.726359 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z 
is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.744409 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.764777 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.786175 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.806160 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.825057 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.825289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.825406 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.825500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.825577 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.862453 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb95
1629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.894337 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.916254 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.929182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.929230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.929242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.929263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.929276 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:39Z","lastTransitionTime":"2025-10-02T18:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.932833 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.949366 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:39 crc kubenswrapper[4909]: I1002 18:18:39.967223 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.032339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.032537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.032619 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.032749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.032888 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.136501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.137562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.137736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.137879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.138073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.240866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.241178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.241317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.241439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.241549 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.345494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.345561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.345577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.345601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.345619 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.448238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.448487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.448657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.448986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.450276 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.553632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.553916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.554091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.554305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.554468 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.608346 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.608384 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.608875 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:40 crc kubenswrapper[4909]: E1002 18:18:40.609069 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.609095 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:40 crc kubenswrapper[4909]: E1002 18:18:40.609293 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:40 crc kubenswrapper[4909]: E1002 18:18:40.609546 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:40 crc kubenswrapper[4909]: E1002 18:18:40.609676 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.657010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.657077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.657088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.657105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.657117 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.760289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.760703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.760886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.761089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.761234 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.864532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.864620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.864644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.864673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.864696 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.968223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.968288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.968304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.968330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:40 crc kubenswrapper[4909]: I1002 18:18:40.968348 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:40Z","lastTransitionTime":"2025-10-02T18:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.072117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.072178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.072196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.072225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.072244 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.175447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.175544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.175564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.175593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.175611 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.279554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.279624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.279644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.279669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.279687 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.383700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.383778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.383802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.383829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.383848 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.488858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.488965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.489066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.489097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.489115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.592914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.592966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.592984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.593008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.593052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.700227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.700310 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.700333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.700365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.700388 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.803768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.803859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.803876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.803901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.803920 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.907111 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.907200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.907231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.907259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:41 crc kubenswrapper[4909]: I1002 18:18:41.907277 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:41Z","lastTransitionTime":"2025-10-02T18:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.010575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.010631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.010648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.010670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.010688 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.113791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.113845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.113863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.113885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.113903 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.217231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.217315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.217339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.217365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.217384 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.320382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.320454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.320480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.320514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.320537 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.423955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.424019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.424167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.424195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.424213 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.527156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.527501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.527593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.527691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.527784 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.608337 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:42 crc kubenswrapper[4909]: E1002 18:18:42.608556 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.608751 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.608786 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:42 crc kubenswrapper[4909]: E1002 18:18:42.609011 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:42 crc kubenswrapper[4909]: E1002 18:18:42.609105 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.609544 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:42 crc kubenswrapper[4909]: E1002 18:18:42.609914 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.631523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.631568 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.631599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.631619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.631633 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.734300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.734382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.734403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.734436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.734455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.837967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.838099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.838130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.838165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.838188 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.941600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.941662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.941683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.941708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:42 crc kubenswrapper[4909]: I1002 18:18:42.941726 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:42Z","lastTransitionTime":"2025-10-02T18:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.046013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.046144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.046190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.046221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.046238 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.149669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.150177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.150555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.150770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.150924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.172692 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:43 crc kubenswrapper[4909]: E1002 18:18:43.172933 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:43 crc kubenswrapper[4909]: E1002 18:18:43.173124 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:18:51.173091544 +0000 UTC m=+52.360587443 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.254465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.254557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.254593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.254635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.254663 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.362366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.362459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.362488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.362524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.362559 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.465958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.466016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.466080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.466106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.466123 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.569498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.569561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.569577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.569601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.569618 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.672650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.672705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.672721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.672743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.672760 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.775807 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.775859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.775876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.775902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.775922 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.878804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.878865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.878883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.878907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.878925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.981256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.981289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.981301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.981316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:43 crc kubenswrapper[4909]: I1002 18:18:43.981326 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:43Z","lastTransitionTime":"2025-10-02T18:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.084758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.084861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.084887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.084919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.084945 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.187758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.187810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.187822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.187841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.187857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.290260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.290307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.290319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.290339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.290350 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.393532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.393596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.393610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.393634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.393649 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.496705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.496754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.496767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.496786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.496801 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.598823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.598879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.598893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.598911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.598924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.607432 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.607480 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.607521 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.607459 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:44 crc kubenswrapper[4909]: E1002 18:18:44.607642 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:44 crc kubenswrapper[4909]: E1002 18:18:44.607738 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:44 crc kubenswrapper[4909]: E1002 18:18:44.607874 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:44 crc kubenswrapper[4909]: E1002 18:18:44.608138 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.702285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.702664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.702814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.702965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.703190 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.805439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.805493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.805513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.805537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.805612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.908170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.908452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.908644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.908820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:44 crc kubenswrapper[4909]: I1002 18:18:44.908991 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:44Z","lastTransitionTime":"2025-10-02T18:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.011494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.011752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.011831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.011910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.011980 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.114879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.115353 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.115525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.115689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.115817 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.218387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.218792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.218943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.219157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.219298 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.322512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.323150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.323184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.323215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.323233 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.425902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.426346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.426538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.426726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.426860 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.530215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.530274 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.530292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.530317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.530336 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.633546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.633653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.633683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.633725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.633768 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.737895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.737991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.738009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.738056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.738074 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.841172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.841302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.841333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.841376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.841402 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.944526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.944608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.944628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.944656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:45 crc kubenswrapper[4909]: I1002 18:18:45.944673 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:45Z","lastTransitionTime":"2025-10-02T18:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.047150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.047196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.047213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.047229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.047240 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.149770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.150246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.150416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.150550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.150729 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.254357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.254674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.254919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.255154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.255362 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.359089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.359152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.359171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.359195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.359213 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.409479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.409541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.409559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.409589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.409611 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.430774 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:46Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.436168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.436223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.436248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.436275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.436298 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.457819 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:46Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.462386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.462435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.462453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.462476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.462493 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.483411 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:46Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.487921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.487998 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.488017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.488063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.488080 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.503180 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:46Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.508897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.508954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.508970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.508993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.509011 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.525377 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:46Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.525615 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.528474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.528528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.528551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.528581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.528605 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.607802 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.607846 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.607819 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.607985 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.608160 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.608326 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.608644 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:46 crc kubenswrapper[4909]: E1002 18:18:46.608902 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.632085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.632394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.632528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.632653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.632804 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.747110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.747178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.747197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.747401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.747419 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.850522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.850577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.850594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.850618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.850639 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.954210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.954265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.954276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.954295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:46 crc kubenswrapper[4909]: I1002 18:18:46.954308 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:46Z","lastTransitionTime":"2025-10-02T18:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.056882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.056955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.056979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.057003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.057019 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.160011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.160134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.160153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.160179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.160198 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.263225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.263272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.263284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.263302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.263315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.366262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.366305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.366315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.366333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.366344 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.469786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.469830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.469841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.469858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.469870 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.572964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.573012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.573045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.573068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.573084 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.608870 4909 scope.go:117] "RemoveContainer" containerID="86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.675987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.676053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.676065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.676085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.676099 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.779535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.779591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.779609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.779636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.779656 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.882372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.882426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.882438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.882458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.882471 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.985732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.985779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.985790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.985809 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:47 crc kubenswrapper[4909]: I1002 18:18:47.985821 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:47Z","lastTransitionTime":"2025-10-02T18:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.000167 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/1.log" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.004130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.004804 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.029399 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.055826 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.072825 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.089098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.089153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.089182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.089207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc 
kubenswrapper[4909]: I1002 18:18:48.089224 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.090851 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc 
kubenswrapper[4909]: I1002 18:18:48.106779 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.124856 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.141961 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.160707 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.179594 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192135 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.192163 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.206544 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.227872 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.245652 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.264697 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.285132 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.294757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.294835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.294850 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.294892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.294911 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.304276 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb95
1629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.356322 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.366232 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.379619 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.393134 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.397651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.397834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.397907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.397983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.398072 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.406483 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.418368 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.447467 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 
18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.465494 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.480557 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.495360 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.500444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.500505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.500517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.500535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.500546 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.511376 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc 
kubenswrapper[4909]: I1002 18:18:48.528700 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.546961 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.560217 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.583492 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.594436 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.602892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.602957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.602974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.602997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.603014 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.606682 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.607825 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.607824 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.608001 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.608058 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:48 crc kubenswrapper[4909]: E1002 18:18:48.608391 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:48 crc kubenswrapper[4909]: E1002 18:18:48.608489 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:48 crc kubenswrapper[4909]: E1002 18:18:48.608686 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:48 crc kubenswrapper[4909]: E1002 18:18:48.608880 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.631706 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:48Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.705645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.705925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.706012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.706130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.706200 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.808894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.809211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.809344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.809464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.809590 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.912271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.912332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.912351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.912377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:48 crc kubenswrapper[4909]: I1002 18:18:48.912395 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:48Z","lastTransitionTime":"2025-10-02T18:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.010470 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/2.log" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.011748 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/1.log" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.014573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.014616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.014635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.014658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.014675 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.016308 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" exitCode=1 Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.016459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.016526 4909 scope.go:117] "RemoveContainer" containerID="86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.017469 4909 scope.go:117] "RemoveContainer" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" Oct 02 18:18:49 crc kubenswrapper[4909]: E1002 18:18:49.017674 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.045305 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.065711 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.085449 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.101895 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.117702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.117759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.117775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.117801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.117820 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.120847 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.138018 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.168533 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7b
d7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.191281 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.212775 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.220245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.220310 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.220331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.220358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.220376 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.232516 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc 
kubenswrapper[4909]: I1002 18:18:49.255063 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.282374 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.298466 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.317899 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.322643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.322797 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.322874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc 
kubenswrapper[4909]: I1002 18:18:49.322941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.323002 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.334361 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02
T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.349244 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.361891 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.425700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.426281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.426400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.426489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.426563 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.529514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.529566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.529582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.529603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.529620 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.624362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.632486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.632545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.632563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.632587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.632605 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.644590 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.658928 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.678978 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.697857 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c
780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.730189 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86d122c8d213042ff2792b68e9e1151a073f702fcb1d5089f5283281ad439460\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405530 6375 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.405583 6375 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 18:18:34.405635 6375 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 18:18:34.405957 6375 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 18:18:34.406464 6375 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 18:18:34.406491 6375 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 18:18:34.406510 6375 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 18:18:34.406562 6375 factory.go:656] Stopping watch factory\\\\nI1002 18:18:34.406587 6375 ovnkube.go:599] Stopped ovnkube\\\\nI1002 18:18:34.406648 6375 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 18:18:34.406668 6375 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 18:18:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7b
d7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.735499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.735546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.735564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.735585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.735602 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.747390 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.761775 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.775351 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.791819 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.813665 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.828373 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.838176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.838211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.838222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.838236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.838248 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.843378 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc 
kubenswrapper[4909]: I1002 18:18:49.858813 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.877995 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.890725 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.903093 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.940185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.940288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.940302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:49 crc 
kubenswrapper[4909]: I1002 18:18:49.940320 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:49 crc kubenswrapper[4909]: I1002 18:18:49.940332 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:49Z","lastTransitionTime":"2025-10-02T18:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.024789 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/2.log" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.034496 4909 scope.go:117] "RemoveContainer" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" Oct 02 18:18:50 crc kubenswrapper[4909]: E1002 18:18:50.034758 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.044172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.044279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.044303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.044336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.044359 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.060132 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 
18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.086616 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3
df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.099209 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.112141 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc 
kubenswrapper[4909]: I1002 18:18:50.126608 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.140495 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.149298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.149352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.149371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.149396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.149414 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.159186 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.172109 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.190620 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.207884 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.223993 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.245086 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bc
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.252142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.252219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.252233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.252253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.252663 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.260963 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.274780 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.289365 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.302418 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.320435 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:50Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.355928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.355982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.355994 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.356015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.356325 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.459313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.460078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.460124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.460149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.460168 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.563263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.563303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.563312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.563329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.563339 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.608473 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:50 crc kubenswrapper[4909]: E1002 18:18:50.608661 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.609157 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.609263 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:50 crc kubenswrapper[4909]: E1002 18:18:50.609366 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.609263 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:50 crc kubenswrapper[4909]: E1002 18:18:50.609460 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:50 crc kubenswrapper[4909]: E1002 18:18:50.609537 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.665841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.665887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.665904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.665924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.665940 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.768516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.768597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.768619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.768646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.768665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.871004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.871096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.871107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.871122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.871134 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.974899 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.975013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.975051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.975072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:50 crc kubenswrapper[4909]: I1002 18:18:50.975084 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:50Z","lastTransitionTime":"2025-10-02T18:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.077868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.077925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.077941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.077966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.077984 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.181852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.181934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.181959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.181996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.182021 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.196396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.196608 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.196713 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:07.196681286 +0000 UTC m=+68.384177185 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.285958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.286294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.286343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.286368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.286684 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.389989 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.390057 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.390069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.390087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.390099 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.398514 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.398753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.398898 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.398957 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:23.398939118 +0000 UTC m=+84.586434987 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.399072 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:19:23.398989979 +0000 UTC m=+84.586485878 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.493925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.493993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.494011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.494087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.494112 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.499922 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.499984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.500087 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500235 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 
18:18:51.500236 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500281 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500284 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500302 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500315 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:23.50029419 +0000 UTC m=+84.687790089 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500322 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500346 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500385 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:23.500357512 +0000 UTC m=+84.687853411 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:51 crc kubenswrapper[4909]: E1002 18:18:51.500413 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:23.500401224 +0000 UTC m=+84.687897123 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.597591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.597671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.597694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.597721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.597739 4909 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.700784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.700836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.700854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.700877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.700895 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.804245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.804546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.804794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.804961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.805138 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.908932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.908990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.909007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.909058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:51 crc kubenswrapper[4909]: I1002 18:18:51.909080 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:51Z","lastTransitionTime":"2025-10-02T18:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.011996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.012373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.012526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.012676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.012842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.116115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.116169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.116186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.116209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.116227 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.219430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.219482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.219500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.219525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.219542 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.323301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.323359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.323376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.323404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.323445 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.426546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.426898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.427099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.427331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.427520 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.530953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.531008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.531055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.531079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.531095 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.608344 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.608357 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:52 crc kubenswrapper[4909]: E1002 18:18:52.608565 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.608396 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.608377 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:52 crc kubenswrapper[4909]: E1002 18:18:52.608685 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:52 crc kubenswrapper[4909]: E1002 18:18:52.608797 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:52 crc kubenswrapper[4909]: E1002 18:18:52.608902 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.633475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.633760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.633894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.634073 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.634245 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.737118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.737168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.737185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.737211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.737231 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.840632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.840713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.840741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.840781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.840807 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.944391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.944499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.944557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.944582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:52 crc kubenswrapper[4909]: I1002 18:18:52.944599 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:52Z","lastTransitionTime":"2025-10-02T18:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.047487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.047537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.047552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.047573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.047588 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.150144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.150188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.150204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.150226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.150243 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.253576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.253646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.253665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.253694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.253711 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.356577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.356641 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.356719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.356794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.356825 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.459999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.460117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.460143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.460176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.460203 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.563427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.563501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.563523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.563549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.563567 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.666316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.666384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.666403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.666425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.666443 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.769071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.769121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.769137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.769159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.769175 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.875421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.875817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.875851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.875876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.875902 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.980588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.980648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.980664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.980689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:53 crc kubenswrapper[4909]: I1002 18:18:53.980707 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:53Z","lastTransitionTime":"2025-10-02T18:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.083971 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.084139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.084159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.084183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.084202 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.186784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.186862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.186887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.186918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.186946 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.290088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.290146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.290164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.290190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.290209 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.392855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.392906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.392921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.392940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.392956 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.495811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.495870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.495891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.495919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.495941 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.598736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.598855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.598874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.598898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.598915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.607329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.607405 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.607472 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.607329 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:54 crc kubenswrapper[4909]: E1002 18:18:54.607544 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:54 crc kubenswrapper[4909]: E1002 18:18:54.607710 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:54 crc kubenswrapper[4909]: E1002 18:18:54.607828 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:54 crc kubenswrapper[4909]: E1002 18:18:54.608010 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.701987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.702078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.702097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.702121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.702137 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.804851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.804935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.804965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.804993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.805013 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.907656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.907704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.907720 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.907744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:54 crc kubenswrapper[4909]: I1002 18:18:54.907760 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:54Z","lastTransitionTime":"2025-10-02T18:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.011149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.011183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.011193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.011209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.011220 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.113674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.113699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.113835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.113934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.113944 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.217324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.217393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.217420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.217448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.217467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.320547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.320601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.320619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.320642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.320659 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.424502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.424559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.424581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.424612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.424635 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.527104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.527146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.527158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.527171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.527181 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.629510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.629584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.629604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.629628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.629645 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.732824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.732906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.732931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.732963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.732989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.836273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.836330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.836347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.836370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.836387 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.938738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.938801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.938823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.938851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:55 crc kubenswrapper[4909]: I1002 18:18:55.938915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:55Z","lastTransitionTime":"2025-10-02T18:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.041867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.041929 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.041945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.041969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.041985 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.145390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.145451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.145463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.145485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.145499 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.247932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.247983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.247999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.248022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.248075 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.351483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.352021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.352303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.352513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.352704 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.455646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.455697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.455710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.455729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.455743 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.558183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.558243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.558264 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.558294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.558316 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.608291 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.608660 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.608346 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.609011 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.608291 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.608350 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.609619 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.609806 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.661785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.661842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.661861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.661886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.661904 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.764828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.764886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.764903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.764927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.764945 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.867358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.867397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.867409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.867425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.867438 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.912649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.912728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.912750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.912776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.912795 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.929217 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:56Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.934425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.934472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.934484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.934500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.934513 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.953346 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:56Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.958117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.958181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.958242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.958275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.958297 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.976547 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:56Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.980831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.980884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.980897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.980940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:56 crc kubenswrapper[4909]: I1002 18:18:56.980958 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:56Z","lastTransitionTime":"2025-10-02T18:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:56 crc kubenswrapper[4909]: E1002 18:18:56.997972 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:56Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.003126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.003166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.003178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.003194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.003207 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: E1002 18:18:57.021089 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:57Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:57 crc kubenswrapper[4909]: E1002 18:18:57.021314 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.023590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.023652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.023678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.023716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.023738 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.126859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.126925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.126949 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.126980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.127002 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.229425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.229497 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.229521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.229553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.229574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.333378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.333435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.333452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.333478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.333496 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.436504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.436546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.436557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.436570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.436591 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.539672 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.539744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.539766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.539796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.539819 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.642478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.642550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.642569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.642592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.642609 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.745923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.746942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.747139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.747292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.747433 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.850903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.850969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.850993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.851061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.851091 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.954069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.954150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.954172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.954205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:57 crc kubenswrapper[4909]: I1002 18:18:57.954227 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:57Z","lastTransitionTime":"2025-10-02T18:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.057235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.057617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.057793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.057923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.058091 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.162168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.162561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.162719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.162880 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.163067 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.266227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.266302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.266326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.266356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.266379 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.368781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.368854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.368873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.368900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.368923 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.471669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.471726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.471746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.471771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.471788 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.574374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.574468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.574486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.574512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.574534 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.608292 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.608358 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.608362 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.608294 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:18:58 crc kubenswrapper[4909]: E1002 18:18:58.608497 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:18:58 crc kubenswrapper[4909]: E1002 18:18:58.608651 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:18:58 crc kubenswrapper[4909]: E1002 18:18:58.608801 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:18:58 crc kubenswrapper[4909]: E1002 18:18:58.608927 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.678309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.678372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.678389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.678414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.678431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.782717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.782771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.782787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.782812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.782829 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.885988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.886080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.886104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.886132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.886169 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.988635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.988698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.988715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.988741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:58 crc kubenswrapper[4909]: I1002 18:18:58.988759 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:58Z","lastTransitionTime":"2025-10-02T18:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.090735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.090800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.090817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.090841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.090860 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.193349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.193418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.193435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.193462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.193480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.296780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.297205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.297357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.297500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.297642 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.401313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.401699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.401917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.402253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.402484 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.505359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.505675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.505981 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.506426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.506731 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.609184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.609583 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.609667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.609757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.609854 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.621670 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.644243 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.657114 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.674206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.693088 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713397 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.713726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2
c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.734370 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.755218 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.769575 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.787174 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.802953 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.816508 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.816812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.816963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.817112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.817232 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.822596 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z 
is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.841187 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.871130 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.891071 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.913353 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.919793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.919845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.919865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.919890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.919909 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:18:59Z","lastTransitionTime":"2025-10-02T18:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:18:59 crc kubenswrapper[4909]: I1002 18:18:59.935156 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.023216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.023287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.023307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.023335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.023355 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.126794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.126874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.126932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.126957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.126975 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.229793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.229839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.229861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.229887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.229904 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.333140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.333204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.333227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.333257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.333279 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.436169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.436244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.436269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.436300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.436322 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.538977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.539038 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.539051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.539064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.539073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.607446 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.607494 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.607526 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.607450 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:00 crc kubenswrapper[4909]: E1002 18:19:00.607571 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:00 crc kubenswrapper[4909]: E1002 18:19:00.607656 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:00 crc kubenswrapper[4909]: E1002 18:19:00.607732 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:00 crc kubenswrapper[4909]: E1002 18:19:00.607962 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.642281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.642324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.642333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.642350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.642360 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.744927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.745007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.745072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.745107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.745130 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.847768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.847828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.847845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.847873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.847892 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.951204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.951281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.951307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.951339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:00 crc kubenswrapper[4909]: I1002 18:19:00.951444 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:00Z","lastTransitionTime":"2025-10-02T18:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.053867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.053921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.053938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.053962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.053980 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.157379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.157437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.157447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.157469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.157481 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.259996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.260042 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.260052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.260065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.260078 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.363205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.363262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.363279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.363329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.363346 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.466092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.466152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.466186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.466207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.466219 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.568455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.568509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.568525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.568546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.568570 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.671402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.671460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.671484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.671511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.671533 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.774335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.774399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.774416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.774442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.774465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.877478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.877546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.877563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.877587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.877605 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.981161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.981236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.981258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.981287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:01 crc kubenswrapper[4909]: I1002 18:19:01.981313 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:01Z","lastTransitionTime":"2025-10-02T18:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.088030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.088134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.088160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.088183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.088195 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.191133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.191173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.191182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.191200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.191210 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.294448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.294489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.294498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.294513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.294523 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.397393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.397432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.397440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.397456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.397467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.500246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.500307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.500324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.500349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.500378 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.603193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.603241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.603259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.603284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.603303 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.608889 4909 scope.go:117] "RemoveContainer" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" Oct 02 18:19:02 crc kubenswrapper[4909]: E1002 18:19:02.609200 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.609582 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:02 crc kubenswrapper[4909]: E1002 18:19:02.609777 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.610299 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:02 crc kubenswrapper[4909]: E1002 18:19:02.610445 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.610712 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:02 crc kubenswrapper[4909]: E1002 18:19:02.610852 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.612091 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:02 crc kubenswrapper[4909]: E1002 18:19:02.612231 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.706712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.706815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.706840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.706908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.706933 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.809428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.809497 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.809515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.809539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.809560 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.913085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.913311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.913330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.913357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:02 crc kubenswrapper[4909]: I1002 18:19:02.913388 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:02Z","lastTransitionTime":"2025-10-02T18:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.016429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.016525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.016541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.016561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.016597 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.123947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.124081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.124102 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.124127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.124181 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.227333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.227397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.227410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.227445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.227457 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.331175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.331241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.331259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.331286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.331312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.435611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.435652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.435662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.435677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.435687 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.538357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.538409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.538419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.538437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.538448 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.641584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.641650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.641670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.641696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.641714 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.744879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.745172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.745273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.745379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.745475 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.847875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.847959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.847981 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.848019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.848071 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.951348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.951438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.951457 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.951480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:03 crc kubenswrapper[4909]: I1002 18:19:03.951497 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:03Z","lastTransitionTime":"2025-10-02T18:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.054330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.054756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.054915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.055106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.055271 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.157970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.158351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.158502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.158630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.158764 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.261604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.261653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.261664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.261680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.261691 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.364993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.365076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.365095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.365119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.365140 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.467441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.467501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.467512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.467529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.467543 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.569745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.569961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.570081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.570246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.570311 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.607393 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.607435 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:04 crc kubenswrapper[4909]: E1002 18:19:04.607554 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.607999 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:04 crc kubenswrapper[4909]: E1002 18:19:04.608064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.608117 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:04 crc kubenswrapper[4909]: E1002 18:19:04.608385 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:04 crc kubenswrapper[4909]: E1002 18:19:04.608532 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.620049 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.672197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.672231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.672242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.672255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.672266 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.774114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.774190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.774205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.774219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.774229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.876714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.876877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.876947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.877064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.877167 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.979830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.979898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.979918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.979943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:04 crc kubenswrapper[4909]: I1002 18:19:04.979965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:04Z","lastTransitionTime":"2025-10-02T18:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.082788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.082848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.082867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.082893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.082913 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.186482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.186532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.186545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.186566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.186579 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.289567 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.289624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.289643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.289670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.289688 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.391989 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.392055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.392072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.392089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.392100 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.496169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.496224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.496237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.496255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.496266 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.599217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.599277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.599293 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.599311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.599326 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.702360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.702432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.702445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.702464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.702476 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.804369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.804419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.804428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.804446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.804455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.906836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.906904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.906916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.906936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:05 crc kubenswrapper[4909]: I1002 18:19:05.906952 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:05Z","lastTransitionTime":"2025-10-02T18:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.009574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.009631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.009650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.009681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.009723 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.112379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.112417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.112428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.112444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.112454 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.215376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.215454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.215474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.215506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.215525 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.318187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.318237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.318257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.318284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.318301 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.421471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.421526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.421541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.421566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.421581 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.524170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.524231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.524243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.524259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.524270 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.608133 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.608181 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.608218 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.608151 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:06 crc kubenswrapper[4909]: E1002 18:19:06.608314 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:06 crc kubenswrapper[4909]: E1002 18:19:06.608447 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:06 crc kubenswrapper[4909]: E1002 18:19:06.608559 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:06 crc kubenswrapper[4909]: E1002 18:19:06.608633 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.627246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.627285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.627304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.627326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.627342 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.729817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.729857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.729872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.729887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.729900 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.832514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.832559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.832570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.832586 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.832598 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.935356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.935454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.935469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.935489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:06 crc kubenswrapper[4909]: I1002 18:19:06.935501 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:06Z","lastTransitionTime":"2025-10-02T18:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.039141 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.039247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.039265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.039288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.039307 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.141304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.141377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.141390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.141411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.141424 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.148887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.148921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.148930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.148944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.148955 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.168299 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.173509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.173551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.173560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.173575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.173584 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.194148 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.198268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.198318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.198331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.198347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.198359 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.215806 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.220095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.220152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.220167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.220187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.220200 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.240069 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.244955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.244976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.244986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.245000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.245012 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.258530 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:07Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.258672 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.260584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.260630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.260644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.260665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.260678 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.269425 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.269612 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:19:07 crc kubenswrapper[4909]: E1002 18:19:07.269813 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:19:39.269786039 +0000 UTC m=+100.457281938 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.362822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.362853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.362861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.362873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.362882 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.466078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.466153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.466174 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.466196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.466214 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.573280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.573321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.573330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.573366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.573379 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.676138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.676416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.676507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.676593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.676671 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.780974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.781013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.781021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.781054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.781063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.884465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.884521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.884532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.884549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.884558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.987156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.987209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.987219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.987238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:07 crc kubenswrapper[4909]: I1002 18:19:07.987250 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:07Z","lastTransitionTime":"2025-10-02T18:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.089739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.089799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.089816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.089841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.089862 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.192784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.192847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.192857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.192876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.192890 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.295707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.295751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.295759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.295775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.295785 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.398272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.398335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.398358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.398387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.398410 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.500636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.500677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.500689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.500709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.500721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.603338 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.603387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.603405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.603425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.603443 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.607638 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.607653 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.607689 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.607734 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:08 crc kubenswrapper[4909]: E1002 18:19:08.607862 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:08 crc kubenswrapper[4909]: E1002 18:19:08.607948 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:08 crc kubenswrapper[4909]: E1002 18:19:08.608114 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:08 crc kubenswrapper[4909]: E1002 18:19:08.608162 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.705987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.706052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.706065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.706086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.706097 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.808951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.808995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.809007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.809024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.809052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.910736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.910768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.910777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.910803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:08 crc kubenswrapper[4909]: I1002 18:19:08.910813 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:08Z","lastTransitionTime":"2025-10-02T18:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.014411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.014460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.014476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.014493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.014505 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.112366 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/0.log" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.112404 4909 generic.go:334] "Generic (PLEG): container finished" podID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e" containerID="b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd" exitCode=1 Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.112426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerDied","Data":"b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.112713 4909 scope.go:117] "RemoveContainer" containerID="b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.117058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.117104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.117117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.117137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.117159 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.131640 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.148370 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.160748 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.172629 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.190398 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.208438 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.220416 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.221585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.221693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.221709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.221726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.221740 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.236330 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.251599 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\
"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.263349 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.275128 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.295702 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.309526 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.323136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.323275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.323338 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.323411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.323476 4909 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.329837 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.344223 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.354978 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc 
kubenswrapper[4909]: I1002 18:19:09.370101 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.388010 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.427354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.427405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.427416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.427433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.427446 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.529613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.529655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.529665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.529682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.529693 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.625104 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc 
kubenswrapper[4909]: I1002 18:19:09.638619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.638673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.638682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.638697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.638708 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.644968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.661664 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.673950 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.689694 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.703675 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.716370 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.731248 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.741641 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.741695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.741709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.741730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.741741 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.747209 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.758265 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.769979 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.782447 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.794615 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.810249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.841670 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.844596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.844623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.844632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.844647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.844657 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.856752 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.876301 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e8048
16845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.893117 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:09Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.947696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.947771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.947786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.947805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:09 crc kubenswrapper[4909]: I1002 18:19:09.947818 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:09Z","lastTransitionTime":"2025-10-02T18:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.050002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.050072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.050088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.050110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.050126 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.117530 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/0.log" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.117623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerStarted","Data":"5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.135622 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.150218 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.152775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.152865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.152884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc 
kubenswrapper[4909]: I1002 18:19:10.152906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.152925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.178945 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.193456 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.212594 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.230557 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.242663 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.253102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.255510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.255575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 
18:19:10.255589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.255609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.255622 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.269941 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.285213 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.303438 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.319666 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.337325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.354135 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.358427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.358467 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.358479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.358497 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.358509 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.378338 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.399403 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.418552 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.434782 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:10Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.460782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.460843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.460857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.460878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.460890 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.563900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.563935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.563944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.563958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.563970 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.608370 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.608381 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.608400 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.608515 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:10 crc kubenswrapper[4909]: E1002 18:19:10.608660 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:10 crc kubenswrapper[4909]: E1002 18:19:10.608777 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:10 crc kubenswrapper[4909]: E1002 18:19:10.608921 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:10 crc kubenswrapper[4909]: E1002 18:19:10.609017 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.666634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.666685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.666696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.666714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.666726 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.769557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.769610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.769627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.769651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.769669 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.871557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.871600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.871610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.871625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.871635 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.973872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.973916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.973925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.973941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:10 crc kubenswrapper[4909]: I1002 18:19:10.973950 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:10Z","lastTransitionTime":"2025-10-02T18:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.077459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.077538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.077554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.077574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.077587 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.182418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.182467 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.182483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.182510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.182527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.285012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.285120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.285176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.285202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.285218 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.388884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.388958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.388977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.389002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.389020 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.492173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.492216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.492235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.492256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.492272 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.595287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.595334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.595345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.595361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.595373 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.698287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.698349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.698371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.698397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.698414 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.800830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.800866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.800874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.800887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.800897 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.904313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.904356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.904365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.904381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:11 crc kubenswrapper[4909]: I1002 18:19:11.904390 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:11Z","lastTransitionTime":"2025-10-02T18:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.007369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.007404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.007413 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.007428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.007437 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.110111 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.110178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.110196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.110224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.110242 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.213612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.214612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.214837 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.215090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.215240 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.318350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.318384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.318393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.318405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.318412 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.421067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.421105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.421117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.421131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.421142 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.523198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.523255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.523327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.523352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.523408 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.608329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.608417 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.608542 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:12 crc kubenswrapper[4909]: E1002 18:19:12.608821 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.608855 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:12 crc kubenswrapper[4909]: E1002 18:19:12.608988 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:12 crc kubenswrapper[4909]: E1002 18:19:12.609104 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:12 crc kubenswrapper[4909]: E1002 18:19:12.609373 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.626885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.626951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.626972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.626994 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.627011 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.729386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.729419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.729429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.729439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.729450 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.831531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.831615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.831634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.831658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.831679 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.934476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.934527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.934538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.934558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:12 crc kubenswrapper[4909]: I1002 18:19:12.934570 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:12Z","lastTransitionTime":"2025-10-02T18:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.036907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.036950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.036958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.036973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.036982 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.139291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.139360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.139382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.139411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.139434 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.241886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.241957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.241976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.242004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.242047 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.344983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.345051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.345060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.345080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.345092 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.448416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.448551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.448573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.448598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.448618 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.550956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.551003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.551048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.551068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.551081 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.654295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.654363 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.654380 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.654411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.654429 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.757741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.757803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.757817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.757839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.757854 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.860427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.860487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.860505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.860529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.860549 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.963247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.963304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.963322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.963348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:13 crc kubenswrapper[4909]: I1002 18:19:13.963366 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:13Z","lastTransitionTime":"2025-10-02T18:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.066115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.066245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.066263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.066287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.066307 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.168756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.168833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.168857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.168904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.168933 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.273327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.273399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.273417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.273442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.273461 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.377287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.377347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.377364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.377388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.377407 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.480359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.480429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.480452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.480485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.480508 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.583110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.583167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.583184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.583207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.583225 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.608164 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.608235 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.608274 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:14 crc kubenswrapper[4909]: E1002 18:19:14.608350 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.608356 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:14 crc kubenswrapper[4909]: E1002 18:19:14.608541 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:14 crc kubenswrapper[4909]: E1002 18:19:14.608698 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:14 crc kubenswrapper[4909]: E1002 18:19:14.608756 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.686148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.686191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.686200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.686216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.686226 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.789450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.789510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.789528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.789554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.789572 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.904344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.904448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.904459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.904474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:14 crc kubenswrapper[4909]: I1002 18:19:14.904483 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:14Z","lastTransitionTime":"2025-10-02T18:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.006718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.006771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.006791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.006815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.006834 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.110257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.110341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.110366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.110399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.110421 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.215324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.215400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.215426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.215469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.215492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.318396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.318465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.318493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.318524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.318545 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.421516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.421588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.421611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.421643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.421665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.524472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.524607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.524631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.524661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.524681 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.609053 4909 scope.go:117] "RemoveContainer" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.627254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.627335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.627350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.627370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.627383 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.730962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.731073 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.731099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.731180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.731198 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.833426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.833456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.833466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.833482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.833494 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.937125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.937187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.937204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.937228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:15 crc kubenswrapper[4909]: I1002 18:19:15.937246 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:15Z","lastTransitionTime":"2025-10-02T18:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.041349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.041393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.041404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.041420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.041431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143127 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/2.log" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.143719 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.149094 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.149826 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.170172 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12
962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 
crc kubenswrapper[4909]: I1002 18:19:16.189473 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354
e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.234584 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.246988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.247045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.247053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.247069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.247080 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.254450 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.270325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.291835 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.309518 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.326229 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.343743 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.349500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.349540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.349553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.349571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.349585 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.373968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc 
kubenswrapper[4909]: I1002 18:19:16.397203 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.421519 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.440439 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451385 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451884 4909 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.451917 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.462975 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.474965 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.487241 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.499944 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cfac
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.554718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.554787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.554797 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.554810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.554820 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.608158 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.608175 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:16 crc kubenswrapper[4909]: E1002 18:19:16.608405 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.608212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:16 crc kubenswrapper[4909]: E1002 18:19:16.608454 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.608168 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:16 crc kubenswrapper[4909]: E1002 18:19:16.608522 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:16 crc kubenswrapper[4909]: E1002 18:19:16.608720 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.658224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.658280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.658300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.658323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.658341 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.761622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.761668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.761682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.761702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.761715 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.864677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.864725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.864737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.864754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.864766 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.967567 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.967615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.967632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.967656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:16 crc kubenswrapper[4909]: I1002 18:19:16.967673 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:16Z","lastTransitionTime":"2025-10-02T18:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.071214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.071266 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.071283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.071308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.071349 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.156566 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/3.log" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.157605 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/2.log" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.162442 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" exitCode=1 Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.162511 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.162571 4909 scope.go:117] "RemoveContainer" containerID="5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.164827 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.165274 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.173815 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.173870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.173886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.173910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.173928 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.185062 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.204375 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.220178 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.239097 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cfac
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.256396 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242
b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 
2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.273656 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.277368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.277427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.277443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.277466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.277481 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.291752 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.313215 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.328154 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.352827 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c09b484ffc44530bdb2194bc415c5b644a215662167cd0ddf8b7180816298b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"message\\\":\\\"nat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 18:18:48.547332 6575 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"5b85277d-d9b7-4a68-8e4e-2b80594d9347\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:16Z\\\",\\\"message\\\":\\\"s: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 18:19:16.696413 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z]\\\\nI1002 18:19:16.696456 6932 model_client.go:398] Mutate operations generated as: 
[{Op:mu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7
b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.368461 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.379963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.380020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.380063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.380088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.380231 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.389892 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.403050 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.416641 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc 
kubenswrapper[4909]: I1002 18:19:17.432384 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.445630 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.460562 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.478574 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.482661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.482739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.482794 4909 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.482826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.482848 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.540833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.541265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.541402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.541486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.541545 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.560888 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.566054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.566235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.566323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.566436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.566537 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.581880 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.586672 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.586733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.586752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.586776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.586796 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.599512 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.603677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.603734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.603751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.603775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.603791 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.618082 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.623554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.623596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.623610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.623631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.623647 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.639198 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:17Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:17 crc kubenswrapper[4909]: E1002 18:19:17.639313 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.641294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.641365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.641386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.641418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.641442 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.744871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.744942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.744960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.744990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.745009 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.848332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.848399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.848415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.848442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.848460 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.951436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.951495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.951525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.951555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:17 crc kubenswrapper[4909]: I1002 18:19:17.951573 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:17Z","lastTransitionTime":"2025-10-02T18:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.062371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.062478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.062500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.062528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.062547 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.165480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.165536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.165554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.165579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.165598 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.168641 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/3.log" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.174156 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" Oct 02 18:19:18 crc kubenswrapper[4909]: E1002 18:19:18.174350 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.191891 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.208040 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.223768 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.240873 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.262370 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.268521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.268591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.268603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.268622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.268661 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.276334 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.290351 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.308954 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.323454 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.348102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.367216 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.372118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.372269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.372372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.372493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.372595 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.385410 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.403671 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.434249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:16Z\\\",\\\"message\\\":\\\"s: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 18:19:16.696413 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z]\\\\nI1002 18:19:16.696456 6932 model_client.go:398] Mutate operations generated as: [{Op:mu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:19:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.454252 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.475372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.475412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.475421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.475441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.475453 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.476625 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.491706 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.507919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:18Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:18 crc 
kubenswrapper[4909]: I1002 18:19:18.579237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.579292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.579311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.579333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.579350 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.608287 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.608344 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.608404 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.608406 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:18 crc kubenswrapper[4909]: E1002 18:19:18.608490 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:18 crc kubenswrapper[4909]: E1002 18:19:18.608561 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:18 crc kubenswrapper[4909]: E1002 18:19:18.608585 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:18 crc kubenswrapper[4909]: E1002 18:19:18.608644 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.682542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.682605 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.682622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.682648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.682664 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.785679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.785732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.785742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.785785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.785797 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.888537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.888584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.888596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.888616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.888626 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.992176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.992246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.992263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.992289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:18 crc kubenswrapper[4909]: I1002 18:19:18.992304 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:18Z","lastTransitionTime":"2025-10-02T18:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.095833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.095910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.095932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.095961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.095981 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.199413 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.199474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.199488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.199512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.199542 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.308568 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.308628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.308651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.308675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.308693 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.412104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.412148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.412160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.412181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.412194 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.514600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.514691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.514701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.514725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.514736 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.618012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.618096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.618115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.618144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.618163 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.629233 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.651531 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.665958 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.679203 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.696342 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.714524 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.720085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.720132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.720143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.720160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.720175 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.731469 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.749188 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.768116 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.789796 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.804563 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.820368 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cfac
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.823168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.823203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.823214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.823234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.823252 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.832843 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.847618 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.864001 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.880101 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.898774 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb951629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995
f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.926058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.926298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.926465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:19 crc 
kubenswrapper[4909]: I1002 18:19:19.926628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.926765 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:19Z","lastTransitionTime":"2025-10-02T18:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:19 crc kubenswrapper[4909]: I1002 18:19:19.931246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:16Z\\\",\\\"message\\\":\\\"s: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 18:19:16.696413 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z]\\\\nI1002 18:19:16.696456 6932 model_client.go:398] Mutate operations generated as: [{Op:mu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:19:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:19Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.029946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.030010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.030053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.030081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.030100 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.134147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.134208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.134227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.134253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.134271 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.237861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.237927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.237944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.237970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.237988 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.340856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.341239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.341453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.341645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.341776 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.445301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.445385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.445408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.445440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.445466 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.550784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.550850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.550868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.550892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.550912 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.607381 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.607514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.607381 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:20 crc kubenswrapper[4909]: E1002 18:19:20.607597 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.607625 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:20 crc kubenswrapper[4909]: E1002 18:19:20.607764 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:20 crc kubenswrapper[4909]: E1002 18:19:20.607924 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:20 crc kubenswrapper[4909]: E1002 18:19:20.608126 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.653959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.654008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.654059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.654083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.654102 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.757646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.757704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.757727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.757756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.757778 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.860915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.860974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.860993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.861017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.861062 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.964194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.964277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.964297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.964321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:20 crc kubenswrapper[4909]: I1002 18:19:20.964338 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:20Z","lastTransitionTime":"2025-10-02T18:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.066735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.066783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.066800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.066823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.066842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.169238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.169297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.169319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.169348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.169371 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.273064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.273120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.273143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.273170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.273192 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.376407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.376473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.376490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.376515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.376534 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.479223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.479290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.479314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.479342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.479365 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.582700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.582758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.582781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.582815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.582836 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.685382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.685439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.685455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.685479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.685497 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.788120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.788149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.788180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.788194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.788204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.891361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.891434 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.891461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.891510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.891532 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.996175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.996251 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.996277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.996306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:21 crc kubenswrapper[4909]: I1002 18:19:21.996329 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:21Z","lastTransitionTime":"2025-10-02T18:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.099921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.099992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.100020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.100122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.100143 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.202933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.203016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.203086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.203119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.203139 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.306014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.306132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.306158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.306184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.306202 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.409653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.409775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.409799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.409831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.409856 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.513694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.513761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.513870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.513904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.513925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.608309 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.608431 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:22 crc kubenswrapper[4909]: E1002 18:19:22.608497 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.608547 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.608776 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:22 crc kubenswrapper[4909]: E1002 18:19:22.608955 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:22 crc kubenswrapper[4909]: E1002 18:19:22.609007 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:22 crc kubenswrapper[4909]: E1002 18:19:22.609112 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.618180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.618266 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.618286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.618312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.618330 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.721956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.722048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.722060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.722080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.722093 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.825387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.825428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.825440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.825453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.825465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.928814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.928886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.928903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.928929 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:22 crc kubenswrapper[4909]: I1002 18:19:22.928953 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:22Z","lastTransitionTime":"2025-10-02T18:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.032224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.032299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.032312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.032334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.032349 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.135393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.135451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.135463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.135484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.135498 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.238825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.238902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.238921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.238943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.238960 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.342444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.342499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.342509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.342526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.342537 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.445954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.446005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.446044 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.446064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.446076 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.457383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.457490 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.457664 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.457634339 +0000 UTC m=+148.645130208 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.457754 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.457809 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.457796844 +0000 UTC m=+148.645292813 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.548884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.548946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.548963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.548987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.549005 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.558466 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.558577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.558615 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558618 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558763 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.558717261 +0000 UTC m=+148.746213150 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558784 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558814 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558791 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558861 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558882 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558827 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558937 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.558919507 +0000 UTC m=+148.746415396 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:19:23 crc kubenswrapper[4909]: E1002 18:19:23.558966 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.558953698 +0000 UTC m=+148.746449597 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.652010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.652112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.652129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.652154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.652171 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.755880 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.755945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.755963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.755992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.756010 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.858366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.858427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.858444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.858467 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.858485 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.961070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.961126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.961141 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.961167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:23 crc kubenswrapper[4909]: I1002 18:19:23.961185 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:23Z","lastTransitionTime":"2025-10-02T18:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.064377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.064459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.064492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.064527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.064549 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.167468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.167530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.167555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.167587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.167610 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.270299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.270359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.270378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.270402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.270422 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.373430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.373483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.373499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.373521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.373541 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.476128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.476184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.476203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.476227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.476243 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.578757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.578820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.578842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.578870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.578892 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.607825 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.607965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.608071 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:24 crc kubenswrapper[4909]: E1002 18:19:24.608104 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.608132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:24 crc kubenswrapper[4909]: E1002 18:19:24.608265 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:24 crc kubenswrapper[4909]: E1002 18:19:24.608412 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:24 crc kubenswrapper[4909]: E1002 18:19:24.608563 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.682290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.682350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.682366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.682388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.682406 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.785733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.785783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.785800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.785824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.785842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.888623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.888683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.888700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.888721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.888737 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.991915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.992022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.992107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.992139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:24 crc kubenswrapper[4909]: I1002 18:19:24.992162 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:24Z","lastTransitionTime":"2025-10-02T18:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.095424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.095482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.095499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.095524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.095543 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.198121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.198171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.198187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.198210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.198228 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.301237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.301305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.301323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.301348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.301368 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.406466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.406517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.406533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.406570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.406587 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.509701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.509757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.509778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.509802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.509820 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.612829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.612903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.612928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.612961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.612986 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.717277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.717343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.717359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.717384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.717402 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.820172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.820237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.820255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.820280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.820297 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.923326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.923385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.923403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.923428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:25 crc kubenswrapper[4909]: I1002 18:19:25.923445 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:25Z","lastTransitionTime":"2025-10-02T18:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.027330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.027425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.027444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.027503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.027523 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.130478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.130540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.130558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.130659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.130681 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.233892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.233965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.233991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.234019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.234074 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.338219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.338290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.338313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.338343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.338367 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.441498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.441596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.441615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.441699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.441721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.545461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.545512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.545531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.545556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.545574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.608233 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.608316 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:26 crc kubenswrapper[4909]: E1002 18:19:26.608457 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.608497 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.608533 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:26 crc kubenswrapper[4909]: E1002 18:19:26.608659 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:26 crc kubenswrapper[4909]: E1002 18:19:26.608741 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:26 crc kubenswrapper[4909]: E1002 18:19:26.609416 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.628505 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.649204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.649253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.649296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.649317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.649333 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.753180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.753265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.753288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.753317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.753442 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.857009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.857100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.857119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.857146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.857163 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.960598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.960654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.960673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.960698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:26 crc kubenswrapper[4909]: I1002 18:19:26.960716 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:26Z","lastTransitionTime":"2025-10-02T18:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.063725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.063793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.063817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.063845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.063864 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.167393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.167456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.167478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.167507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.167527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.270792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.270860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.270879 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.270905 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.270922 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.373728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.373829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.373855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.373886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.373907 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.476735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.477139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.477297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.477452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.477597 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.580821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.581145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.581244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.581337 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.581431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.685403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.685490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.685517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.685547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.685586 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.788948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.789008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.789060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.789093 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.789115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.892427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.892525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.892544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.892566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.892585 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.966772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.966823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.966836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.966857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.966869 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:27 crc kubenswrapper[4909]: E1002 18:19:27.986557 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:27Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.993892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.993930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.993940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.993957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:27 crc kubenswrapper[4909]: I1002 18:19:27.993968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:27Z","lastTransitionTime":"2025-10-02T18:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.009197 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.013607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.013637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.013647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.013662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.013672 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.028900 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.035445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.035495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.035508 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.035552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.035565 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.054928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.055196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.055350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.055500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.055621 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"de99d26b-a0b1-4b58-8c7b-bc3a5dd3973d\\\",\\\"systemUUID\\\":\\\"7c418406-42b5-4f83-a45d-1cef2c7c1a53\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:28Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.073014 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.074914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.075076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.075192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.075294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.075423 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.178414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.178464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.178474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.178495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.178507 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.281334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.281524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.281561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.281593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.281615 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.384390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.384430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.384438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.384451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.384461 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.486897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.487447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.487466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.487494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.487514 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.590127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.590171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.590181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.590199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.590211 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.607781 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.607819 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.607857 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.607891 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.607995 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.608058 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.608418 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:28 crc kubenswrapper[4909]: E1002 18:19:28.608558 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.693364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.693424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.693441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.693465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.693484 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.796930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.796988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.797003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.797053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.797073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.900150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.900220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.900235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.900257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:28 crc kubenswrapper[4909]: I1002 18:19:28.900273 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:28Z","lastTransitionTime":"2025-10-02T18:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.004234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.004332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.004359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.004394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.004418 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.107805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.107866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.107883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.107906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.107923 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.211099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.211162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.211184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.211210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.211227 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.315404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.315471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.315490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.315515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.315533 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.419207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.419330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.419349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.419374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.419391 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.522817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.522912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.522935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.522964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.522986 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.626420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.626732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.626938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.627119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.627470 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.638751 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:16Z\\\",\\\"message\\\":\\\"s: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1002 18:19:16.696413 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:16Z is after 2025-08-24T17:21:41Z]\\\\nI1002 18:19:16.696456 6932 model_client.go:398] Mutate operations generated as: [{Op:mu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:19:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a558e9dcedf6ab064
78d80866201d3205267b53367c82c99212ef6e7bd7b081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qp72d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4scf8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.654426 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0e6b38-67c8-444e-95ab-df2d244f7d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec2765f999c437ee4110520824d2791df134529e4509112e42be18f5f1a8f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1592a140cc2f9bb539fb1e83ab7a120b64a109d4a686a8202cc9cd219b6d493c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.675783 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"216b6a0f-472f-4608-9866-de5f4a133f02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78ce09045f4e0651a8197b3552cbffce4c43b005fcff781a96cd38874c7cea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae0de7354e8301d189e7a8b220cc26cf80e707b5fe786458024e62dbf45c834\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08a5e94aec8c1457f0dfa93f6f303a7fad02531c82f427754b3038faea9408b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://644e290fdb8020fcf64c42da43b7d9d2fefb302a2ed45edd7f4fbfcc123552f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bece95b8a67f35621603f4ba555103ac6a567f1353dda014484d5e804816845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T18:18:03Z\\\"
,\\\"message\\\":\\\"W1002 18:18:02.870163 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 18:18:02.870549 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759429082 cert, and key in /tmp/serving-cert-486046687/serving-signer.crt, /tmp/serving-cert-486046687/serving-signer.key\\\\nI1002 18:18:03.187161 1 observer_polling.go:159] Starting file observer\\\\nW1002 18:18:03.194240 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 18:18:03.194433 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 18:18:03.197626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-486046687/tls.crt::/tmp/serving-cert-486046687/tls.key\\\\\\\"\\\\nF1002 18:18:03.357897 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f28c6443329be11c2023f1fec50dba2da9b9d1ded58c0ae308353745c97694\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e21f06ca73e48069d9cfdde12e587ccec53a62ed2656457e4b61e5623eedb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.696824 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229ea3a19e1fbdc2f87e9f5d03d430ff695909129f0a6723054bdbe735974af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.718553 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.730127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.730195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.730212 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.730239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.730256 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.738649 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31958374-7b04-45be-9509-c51e08f9afe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd9db3fb95
1629bf357140196d98decb86eadb4edc4dc509ae3bd473fffd6e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsw7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4777h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.759117 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc3f3eabfcb877e76e0c7ba53054e6cfe811b91a56f7f13d91c104aa1d7fac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20810661c52500c0321e41e194e8444e5176850026250d4c70eed5d069707cbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.777540 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"100e4154-9795-41ea-8365-38ab076e57cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a20e6b9dd0b5e9c22a6f1a1bacb03094ec224cd8cdce5d45302b70f7b4bf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241e40dc1a39eda3fd0edcaec1d96293004c3426ecc38d2da6b28125c45c1e6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4f7d64f41b6ece3df3c7e02987d5eace9230c5d05ac7bcf6b782626fe4e9806\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d20f48a5c376a97b17745c00c72d65c500fffc20a47fca200deffb3035cee3f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1751e
addacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1751eaddacea050808a7025551ba7f529ff60a9cb1d1821c122f9afd4b5e2c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a943a430ce34608bf8f7819afdcdbb11c3a9da5fd464b65cdb20b80a1339a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4d8107ffe4d7bfd4b9b5f3d83c76bcf2460ed76e545dc4aa594606c65de085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnbw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6lnlx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.791879 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c51cad9b-1ae8-4c5d-bfd8-dfc178843056\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abacb054a9c3e40b7b5200dc543e517008f64b85e8c1b79d993612815f8f5c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-02T18:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpsc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.807490 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wd57x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccda21fa-5211-460d-b521-fc5c86673b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9n9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wd57x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.832829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.832894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.832926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.832951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.832968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.842058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad337db8-82e1-4dcf-92d0-22306a419be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5efed0c035a70f9cafb36e0314b45dc2ef2c0b190405f2d29aebb90bbf0076e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3294328b2336d9690415ac34b8143c7c0eae10ea727139ee938954c91897c536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://829f91c01ec24d0b7cf680494ba4588058de95d0ed144da2f718b5534ea9394d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a67a53d782ae1009e7016ff28977fd72aaa0d3083c4f0ba47e48b9bc90c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eadf1786888de2315cad908fcbc7cc00876ae9daa9c886a2b17c39a804bd3683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76976bc4dab27c2bc9e8cb12293877f800b24d6208dc2a1fc06cf1fc679d6692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76976bc4dab27c2bc9e8cb12293877f800b24d6208dc2a1fc06cf1fc679d6692\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88865d8c3af16ebeebe7c058415570d789fea42749177b535120e7edc05a670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e88865d8c3af16ebeebe7c058415570d789fea42749177b535120e7edc05a670\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d58f3537f48f50ff2f6359e7f7307b283fd3e69a1b3c9d6fd92e9beb93cd795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d58f3537f48f50ff2f6359e7f7307b283fd3e69a1b3c9d6fd92e9beb93cd795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.859452 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd5e2ff-545d-4b6d-ab15-1dede4bdf0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57565cdc41dcc38f63aaf9e865efd473ffd66c9c0e87ddc1b7e138cb3a9087d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1520073e7d2c3330a951a962494635c87e3bbbd37c2493bb040b7677c74d3918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ecc73c5625ab4c2a426e1c854818a10ff6d2f56741c1cf5ed51d8c24f747ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5baa0eab3af838ecb08a8f8953e806c1e2ae4443404f9aae4f6573f08bedc2ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.875361 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baeded3-9da9-48c8-baa9-62c5dac07f87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11d87baa0c240be284e7c8ee38880e6265ab7daadfcd21cf2c26c8f4fc0db943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92528eef2b4ca71d8fbc0bba5df2b54dca0eb5e7e8d94d14c649006c86e6940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9f8d5a619c4e18399f7cb61e508dac47db1617dd33c9067a23c1ee07c5c4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://960fe9f72c59e9742cc9774f447899abd2c665823e55577e3f9229e82a7a732b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T18:18:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:17:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.887806 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.901199 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"115da42c-b1e9-470d-8734-a3331cdff421\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7b879d406ac9a8d1b52ecf4fcd919b5783bd026926e68335ef618844560e412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090ac2b83ed0b5dbd5d9390b676b6a6fc525e
232bad5e44c0e16169f68cd50ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m25r4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tlwl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.921480 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.935291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.935422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.935494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.935525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.935554 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:29Z","lastTransitionTime":"2025-10-02T18:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.939440 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8cecf2af41854a1313fe27481f1abaa2bb71e686f6515b1b4783a77dcab575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.955635 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kjgfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2951bc3-5e48-4af4-b5ac-7d7c74aad1fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc5f708908dbff5248014754a244ef4b6a2c706aa2f736c70726c9fb64dff30c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxx72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kjgfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:29 crc kubenswrapper[4909]: I1002 18:19:29.972822 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7gpnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T18:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T18:19:08Z\\\",\\\"message\\\":\\\"2025-10-02T18:18:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7\\\\n2025-10-02T18:18:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6ad14ed-73a3-49b9-97db-adbf4a351dd7 to /host/opt/cni/bin/\\\\n2025-10-02T18:18:23Z [verbose] multus-daemon started\\\\n2025-10-02T18:18:23Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T18:19:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T18:18:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T18:19:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6xt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T18:18:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7gpnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T18:19:29Z is after 2025-08-24T17:21:41Z" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.039098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.039164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.039184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.039208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.039227 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.141766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.141817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.141833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.141851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.141867 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.244841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.244921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.244939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.244967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.244985 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.347706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.347772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.347791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.347818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.347835 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.450458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.450516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.450532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.450555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.450574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.554233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.554309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.554330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.554354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.554372 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.608184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.608263 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.608292 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.608271 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:30 crc kubenswrapper[4909]: E1002 18:19:30.608424 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:30 crc kubenswrapper[4909]: E1002 18:19:30.608571 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:30 crc kubenswrapper[4909]: E1002 18:19:30.608658 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:30 crc kubenswrapper[4909]: E1002 18:19:30.608824 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.657477 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.657552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.657575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.657607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.657628 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.761450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.761503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.761520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.761542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.761559 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.865081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.865178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.865200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.865225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.865242 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.968623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.968678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.968695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.968713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:30 crc kubenswrapper[4909]: I1002 18:19:30.968724 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:30Z","lastTransitionTime":"2025-10-02T18:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.071643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.071740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.071769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.071808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.071841 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.174874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.175040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.175066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.175093 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.175110 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.277768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.277829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.277838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.277855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.277867 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.381505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.381574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.381597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.381622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.381639 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.485270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.485335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.485351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.485376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.485393 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.588609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.588673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.588691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.588718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.588741 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.692883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.692975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.692999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.693069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.693094 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.797372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.797422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.797438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.797524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.797543 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.900828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.900902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.900927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.900957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:31 crc kubenswrapper[4909]: I1002 18:19:31.900983 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:31Z","lastTransitionTime":"2025-10-02T18:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.004348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.004414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.004433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.004457 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.004471 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.110950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.111105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.111140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.111184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.111208 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.214616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.214684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.214700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.214732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.214750 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.317790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.317851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.317872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.317897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.317915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.421790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.421838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.421848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.421865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.421875 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.524968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.525085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.525112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.525147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.525170 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.608258 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.608378 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:32 crc kubenswrapper[4909]: E1002 18:19:32.608477 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.608561 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.608572 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:32 crc kubenswrapper[4909]: E1002 18:19:32.608744 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:32 crc kubenswrapper[4909]: E1002 18:19:32.608859 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:32 crc kubenswrapper[4909]: E1002 18:19:32.609440 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.609947 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" Oct 02 18:19:32 crc kubenswrapper[4909]: E1002 18:19:32.610232 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.629333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.629409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.629427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.629450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.629468 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.738000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.738144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.738311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.738396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.738420 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.842427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.842478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.842496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.842520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.842538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.946348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.946423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.946445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.946476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:32 crc kubenswrapper[4909]: I1002 18:19:32.946498 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:32Z","lastTransitionTime":"2025-10-02T18:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.050571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.050628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.050647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.050671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.050688 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.154633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.154741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.154757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.154783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.154799 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.257340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.257400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.257413 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.257433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.257445 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.359527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.359588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.359600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.359623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.359637 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.463230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.463328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.463353 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.463383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.463406 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.566438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.566510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.566536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.566563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.566584 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.669657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.669743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.669770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.669799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.669820 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.773485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.773546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.773562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.773586 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.773602 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.877273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.877442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.877472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.877553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.877626 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.981012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.981100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.981115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.981138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:33 crc kubenswrapper[4909]: I1002 18:19:33.981150 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:33Z","lastTransitionTime":"2025-10-02T18:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.084309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.084402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.084427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.084459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.084489 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.186965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.187044 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.187063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.187087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.187102 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.290553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.290666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.290688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.290713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.290733 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.393830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.393913 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.393934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.393966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.393985 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.497142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.497218 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.497233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.497259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.497280 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.600126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.600163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.600171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.600184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.600193 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.607877 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.607930 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.607885 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:34 crc kubenswrapper[4909]: E1002 18:19:34.607970 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.608005 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:34 crc kubenswrapper[4909]: E1002 18:19:34.608137 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:34 crc kubenswrapper[4909]: E1002 18:19:34.608211 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:34 crc kubenswrapper[4909]: E1002 18:19:34.608236 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.703183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.703267 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.703290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.703322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.703346 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.806203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.806277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.806295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.806339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.806356 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.908804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.908886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.908911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.908942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:34 crc kubenswrapper[4909]: I1002 18:19:34.908965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:34Z","lastTransitionTime":"2025-10-02T18:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.011421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.011492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.011511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.011538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.011556 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.115193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.115267 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.115285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.115313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.115336 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.218219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.218295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.218307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.218330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.218343 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.322282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.322355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.322374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.322399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.322418 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.425735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.426447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.426536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.426633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.426708 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.529250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.529718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.529871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.530059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.530217 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.632695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.632752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.632768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.632789 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.632808 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.735946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.736134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.736212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.736246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.736314 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.839752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.839840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.839866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.839906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.839930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.943498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.943581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.943602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.943628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:35 crc kubenswrapper[4909]: I1002 18:19:35.943647 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:35Z","lastTransitionTime":"2025-10-02T18:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.046715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.046784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.046801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.046829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.046847 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.150100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.150178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.150201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.150231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.150253 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.252777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.252853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.252877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.252910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.252932 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.355576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.355622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.355633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.355649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.355660 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.458537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.458599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.458619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.458644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.458662 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.561077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.561147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.561164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.561188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.561218 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.608180 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.608199 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.608439 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.608487 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:36 crc kubenswrapper[4909]: E1002 18:19:36.608756 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:36 crc kubenswrapper[4909]: E1002 18:19:36.608829 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:36 crc kubenswrapper[4909]: E1002 18:19:36.608939 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:36 crc kubenswrapper[4909]: E1002 18:19:36.609065 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.664427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.664499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.664520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.664545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.664563 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.767832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.767899 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.767922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.767952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.767975 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.870734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.870804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.870822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.870848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.870868 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.973671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.973723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.973740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.973762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:36 crc kubenswrapper[4909]: I1002 18:19:36.973780 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:36Z","lastTransitionTime":"2025-10-02T18:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.076680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.076741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.076758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.076781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.076798 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.179939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.180009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.180054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.180083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.180107 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.283875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.283935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.283952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.283976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.283994 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.387276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.387330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.387349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.387379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.387448 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.490648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.490699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.490714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.490735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.490749 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.594097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.594153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.594171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.594197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.594215 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.697157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.697227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.697245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.697273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.697291 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.800598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.800659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.800676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.800699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.800718 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.904266 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.904335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.904358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.904387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:37 crc kubenswrapper[4909]: I1002 18:19:37.904409 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:37Z","lastTransitionTime":"2025-10-02T18:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.007493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.007565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.007590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.007621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.007644 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:38Z","lastTransitionTime":"2025-10-02T18:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.111420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.111507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.111543 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.111572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.111597 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:38Z","lastTransitionTime":"2025-10-02T18:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.159855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.159966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.159989 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.160018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.160074 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T18:19:38Z","lastTransitionTime":"2025-10-02T18:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.222382 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc"] Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.223128 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.225823 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.225974 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.226558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.226569 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.227855 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.227928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.227962 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.228014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.228114 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.274420 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6lnlx" podStartSLOduration=77.274388776 podStartE2EDuration="1m17.274388776s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.261704566 +0000 UTC m=+99.449200475" watchObservedRunningTime="2025-10-02 18:19:38.274388776 +0000 UTC m=+99.461884685" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.287530 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9j4dc" podStartSLOduration=77.287513611 podStartE2EDuration="1m17.287513611s" 
podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.275093618 +0000 UTC m=+99.462589487" watchObservedRunningTime="2025-10-02 18:19:38.287513611 +0000 UTC m=+99.475009470" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.323154 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.323131957 podStartE2EDuration="1m18.323131957s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.322365743 +0000 UTC m=+99.509861622" watchObservedRunningTime="2025-10-02 18:19:38.323131957 +0000 UTC m=+99.510627846" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.323366 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.323358934 podStartE2EDuration="12.323358934s" podCreationTimestamp="2025-10-02 18:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.309150947 +0000 UTC m=+99.496646826" watchObservedRunningTime="2025-10-02 18:19:38.323358934 +0000 UTC m=+99.510854833" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328430 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328467 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328552 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.328806 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.329567 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.336275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.345367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69ecfbc5-5db0-4542-9aee-ffa0678bf73e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7rlkc\" (UID: \"69ecfbc5-5db0-4542-9aee-ffa0678bf73e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.355311 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.355291297 
podStartE2EDuration="50.355291297s" podCreationTimestamp="2025-10-02 18:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.336104667 +0000 UTC m=+99.523600576" watchObservedRunningTime="2025-10-02 18:19:38.355291297 +0000 UTC m=+99.542787166" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.399980 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tlwl" podStartSLOduration=76.399952801 podStartE2EDuration="1m16.399952801s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.380231805 +0000 UTC m=+99.567727674" watchObservedRunningTime="2025-10-02 18:19:38.399952801 +0000 UTC m=+99.587448670" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.427450 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kjgfb" podStartSLOduration=78.427429907 podStartE2EDuration="1m18.427429907s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.4268621 +0000 UTC m=+99.614357969" watchObservedRunningTime="2025-10-02 18:19:38.427429907 +0000 UTC m=+99.614925776" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.457361 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7gpnt" podStartSLOduration=78.457341818 podStartE2EDuration="1m18.457341818s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.445556776 +0000 UTC m=+99.633052645" 
watchObservedRunningTime="2025-10-02 18:19:38.457341818 +0000 UTC m=+99.644837677" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.496833 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=34.496804963 podStartE2EDuration="34.496804963s" podCreationTimestamp="2025-10-02 18:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.49636629 +0000 UTC m=+99.683862159" watchObservedRunningTime="2025-10-02 18:19:38.496804963 +0000 UTC m=+99.684300842" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.535891 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.535864506 podStartE2EDuration="1m19.535864506s" podCreationTimestamp="2025-10-02 18:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.514129917 +0000 UTC m=+99.701625816" watchObservedRunningTime="2025-10-02 18:19:38.535864506 +0000 UTC m=+99.723360405" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.540406 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.607303 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:38 crc kubenswrapper[4909]: E1002 18:19:38.607657 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.607823 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:38 crc kubenswrapper[4909]: E1002 18:19:38.607871 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.607967 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:38 crc kubenswrapper[4909]: E1002 18:19:38.608008 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:38 crc kubenswrapper[4909]: I1002 18:19:38.608151 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:38 crc kubenswrapper[4909]: E1002 18:19:38.608201 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:39 crc kubenswrapper[4909]: I1002 18:19:39.257879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" event={"ID":"69ecfbc5-5db0-4542-9aee-ffa0678bf73e","Type":"ContainerStarted","Data":"d2dd33a6e67cafed53a4baac5b54b664b768dfcd3717c29e2d69a1a453f78260"} Oct 02 18:19:39 crc kubenswrapper[4909]: I1002 18:19:39.257976 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" event={"ID":"69ecfbc5-5db0-4542-9aee-ffa0678bf73e","Type":"ContainerStarted","Data":"a65363ac99d2f8d2d3d7ec49e259f7076dabfe5427e304e58fab5ce93218ecf1"} Oct 02 18:19:39 crc kubenswrapper[4909]: I1002 18:19:39.277333 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podStartSLOduration=79.277301612 podStartE2EDuration="1m19.277301612s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:38.577315002 +0000 UTC m=+99.764810861" watchObservedRunningTime="2025-10-02 18:19:39.277301612 +0000 UTC m=+100.464797501" Oct 02 18:19:39 crc kubenswrapper[4909]: I1002 18:19:39.277692 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7rlkc" podStartSLOduration=79.277679763 podStartE2EDuration="1m19.277679763s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:39.277220239 +0000 UTC m=+100.464716178" watchObservedRunningTime="2025-10-02 18:19:39.277679763 +0000 UTC m=+100.465175662" Oct 02 18:19:39 crc kubenswrapper[4909]: I1002 18:19:39.337667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:39 crc kubenswrapper[4909]: E1002 18:19:39.337901 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:19:39 crc kubenswrapper[4909]: E1002 18:19:39.338221 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs podName:ccda21fa-5211-460d-b521-fc5c86673b73 nodeName:}" failed. No retries permitted until 2025-10-02 18:20:43.338195126 +0000 UTC m=+164.525690995 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs") pod "network-metrics-daemon-wd57x" (UID: "ccda21fa-5211-460d-b521-fc5c86673b73") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 18:19:40 crc kubenswrapper[4909]: I1002 18:19:40.608201 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:40 crc kubenswrapper[4909]: I1002 18:19:40.608356 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:40 crc kubenswrapper[4909]: E1002 18:19:40.608488 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:40 crc kubenswrapper[4909]: I1002 18:19:40.608561 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:40 crc kubenswrapper[4909]: I1002 18:19:40.608624 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:40 crc kubenswrapper[4909]: E1002 18:19:40.608830 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:40 crc kubenswrapper[4909]: E1002 18:19:40.608933 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:40 crc kubenswrapper[4909]: E1002 18:19:40.609137 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:42 crc kubenswrapper[4909]: I1002 18:19:42.608715 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:42 crc kubenswrapper[4909]: I1002 18:19:42.608788 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:42 crc kubenswrapper[4909]: I1002 18:19:42.608866 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:42 crc kubenswrapper[4909]: I1002 18:19:42.608905 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:42 crc kubenswrapper[4909]: E1002 18:19:42.609148 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:42 crc kubenswrapper[4909]: E1002 18:19:42.609360 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:42 crc kubenswrapper[4909]: E1002 18:19:42.609537 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:42 crc kubenswrapper[4909]: E1002 18:19:42.609692 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:44 crc kubenswrapper[4909]: I1002 18:19:44.608005 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:44 crc kubenswrapper[4909]: E1002 18:19:44.608259 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:44 crc kubenswrapper[4909]: I1002 18:19:44.608082 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:44 crc kubenswrapper[4909]: E1002 18:19:44.608364 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:44 crc kubenswrapper[4909]: I1002 18:19:44.608092 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:44 crc kubenswrapper[4909]: E1002 18:19:44.608431 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:44 crc kubenswrapper[4909]: I1002 18:19:44.608054 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:44 crc kubenswrapper[4909]: E1002 18:19:44.608481 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:45 crc kubenswrapper[4909]: I1002 18:19:45.608924 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" Oct 02 18:19:45 crc kubenswrapper[4909]: E1002 18:19:45.609254 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4scf8_openshift-ovn-kubernetes(4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" Oct 02 18:19:46 crc kubenswrapper[4909]: I1002 18:19:46.608019 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:46 crc kubenswrapper[4909]: I1002 18:19:46.608086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:46 crc kubenswrapper[4909]: I1002 18:19:46.608143 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:46 crc kubenswrapper[4909]: I1002 18:19:46.608143 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:46 crc kubenswrapper[4909]: E1002 18:19:46.608225 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:46 crc kubenswrapper[4909]: E1002 18:19:46.608403 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:46 crc kubenswrapper[4909]: E1002 18:19:46.608471 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:46 crc kubenswrapper[4909]: E1002 18:19:46.608573 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:48 crc kubenswrapper[4909]: I1002 18:19:48.607469 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:48 crc kubenswrapper[4909]: I1002 18:19:48.607537 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:48 crc kubenswrapper[4909]: I1002 18:19:48.607568 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:48 crc kubenswrapper[4909]: E1002 18:19:48.607645 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:48 crc kubenswrapper[4909]: I1002 18:19:48.607481 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:48 crc kubenswrapper[4909]: E1002 18:19:48.607772 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:48 crc kubenswrapper[4909]: E1002 18:19:48.608072 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:48 crc kubenswrapper[4909]: E1002 18:19:48.608239 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:50 crc kubenswrapper[4909]: I1002 18:19:50.607689 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:50 crc kubenswrapper[4909]: E1002 18:19:50.607880 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:50 crc kubenswrapper[4909]: I1002 18:19:50.608264 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:50 crc kubenswrapper[4909]: I1002 18:19:50.608316 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:50 crc kubenswrapper[4909]: E1002 18:19:50.608376 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:50 crc kubenswrapper[4909]: I1002 18:19:50.608458 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:50 crc kubenswrapper[4909]: E1002 18:19:50.608528 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:50 crc kubenswrapper[4909]: E1002 18:19:50.608665 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:52 crc kubenswrapper[4909]: I1002 18:19:52.608243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:52 crc kubenswrapper[4909]: I1002 18:19:52.608363 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:52 crc kubenswrapper[4909]: I1002 18:19:52.608423 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:52 crc kubenswrapper[4909]: E1002 18:19:52.609854 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:52 crc kubenswrapper[4909]: E1002 18:19:52.609285 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:52 crc kubenswrapper[4909]: I1002 18:19:52.608525 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:52 crc kubenswrapper[4909]: E1002 18:19:52.610197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:52 crc kubenswrapper[4909]: E1002 18:19:52.610557 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:54 crc kubenswrapper[4909]: I1002 18:19:54.607879 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:54 crc kubenswrapper[4909]: I1002 18:19:54.607981 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:54 crc kubenswrapper[4909]: I1002 18:19:54.608104 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:54 crc kubenswrapper[4909]: I1002 18:19:54.608142 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:54 crc kubenswrapper[4909]: E1002 18:19:54.608019 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:54 crc kubenswrapper[4909]: E1002 18:19:54.608225 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:54 crc kubenswrapper[4909]: E1002 18:19:54.608380 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:54 crc kubenswrapper[4909]: E1002 18:19:54.608431 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.317743 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/1.log" Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.318759 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/0.log" Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.318855 4909 generic.go:334] "Generic (PLEG): container finished" podID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e" containerID="5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface" exitCode=1 Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.318908 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerDied","Data":"5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface"} Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.318966 4909 scope.go:117] "RemoveContainer" containerID="b2715360a54ec243f3067daaf058964521b72c4a7e096ab618198f953be93bcd" Oct 02 18:19:55 crc kubenswrapper[4909]: I1002 18:19:55.319810 4909 scope.go:117] "RemoveContainer" containerID="5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface" Oct 02 18:19:55 crc 
kubenswrapper[4909]: E1002 18:19:55.320058 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7gpnt_openshift-multus(c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e)\"" pod="openshift-multus/multus-7gpnt" podUID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e" Oct 02 18:19:56 crc kubenswrapper[4909]: I1002 18:19:56.324776 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/1.log" Oct 02 18:19:56 crc kubenswrapper[4909]: I1002 18:19:56.607975 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:56 crc kubenswrapper[4909]: I1002 18:19:56.608434 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:56 crc kubenswrapper[4909]: I1002 18:19:56.608424 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:56 crc kubenswrapper[4909]: I1002 18:19:56.608385 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:56 crc kubenswrapper[4909]: E1002 18:19:56.608559 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:56 crc kubenswrapper[4909]: E1002 18:19:56.609607 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:56 crc kubenswrapper[4909]: E1002 18:19:56.609671 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:56 crc kubenswrapper[4909]: E1002 18:19:56.609746 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:57 crc kubenswrapper[4909]: I1002 18:19:57.609439 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.334720 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/3.log" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.337979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerStarted","Data":"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee"} Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.338442 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.366524 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podStartSLOduration=97.366507528 podStartE2EDuration="1m37.366507528s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:19:58.365168187 +0000 UTC m=+119.552664046" watchObservedRunningTime="2025-10-02 18:19:58.366507528 +0000 UTC m=+119.554003377" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.610384 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.610432 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:58 crc kubenswrapper[4909]: E1002 18:19:58.610521 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.610384 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.610562 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:19:58 crc kubenswrapper[4909]: E1002 18:19:58.611260 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:19:58 crc kubenswrapper[4909]: E1002 18:19:58.611447 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:19:58 crc kubenswrapper[4909]: E1002 18:19:58.611599 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:58 crc kubenswrapper[4909]: I1002 18:19:58.617053 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wd57x"] Oct 02 18:19:59 crc kubenswrapper[4909]: I1002 18:19:59.341503 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:19:59 crc kubenswrapper[4909]: E1002 18:19:59.342019 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:19:59 crc kubenswrapper[4909]: E1002 18:19:59.600178 4909 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 18:19:59 crc kubenswrapper[4909]: E1002 18:19:59.709955 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:20:00 crc kubenswrapper[4909]: I1002 18:20:00.607983 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:00 crc kubenswrapper[4909]: I1002 18:20:00.608346 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:00 crc kubenswrapper[4909]: I1002 18:20:00.608388 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:00 crc kubenswrapper[4909]: I1002 18:20:00.608396 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:00 crc kubenswrapper[4909]: E1002 18:20:00.608501 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:00 crc kubenswrapper[4909]: E1002 18:20:00.608637 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:00 crc kubenswrapper[4909]: E1002 18:20:00.608746 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:00 crc kubenswrapper[4909]: E1002 18:20:00.608893 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:02 crc kubenswrapper[4909]: I1002 18:20:02.607947 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:02 crc kubenswrapper[4909]: I1002 18:20:02.607979 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:02 crc kubenswrapper[4909]: I1002 18:20:02.608003 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:02 crc kubenswrapper[4909]: E1002 18:20:02.608177 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:02 crc kubenswrapper[4909]: I1002 18:20:02.608322 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:02 crc kubenswrapper[4909]: E1002 18:20:02.608383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:02 crc kubenswrapper[4909]: E1002 18:20:02.608601 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:02 crc kubenswrapper[4909]: E1002 18:20:02.608732 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:03 crc kubenswrapper[4909]: I1002 18:20:03.100994 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:20:04 crc kubenswrapper[4909]: I1002 18:20:04.607851 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:04 crc kubenswrapper[4909]: I1002 18:20:04.607898 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:04 crc kubenswrapper[4909]: I1002 18:20:04.608063 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:04 crc kubenswrapper[4909]: I1002 18:20:04.608147 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:04 crc kubenswrapper[4909]: E1002 18:20:04.608197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:04 crc kubenswrapper[4909]: E1002 18:20:04.608400 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:04 crc kubenswrapper[4909]: E1002 18:20:04.608458 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:04 crc kubenswrapper[4909]: E1002 18:20:04.608513 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:04 crc kubenswrapper[4909]: E1002 18:20:04.711739 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:20:06 crc kubenswrapper[4909]: I1002 18:20:06.607645 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:06 crc kubenswrapper[4909]: I1002 18:20:06.607718 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:06 crc kubenswrapper[4909]: I1002 18:20:06.607930 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:06 crc kubenswrapper[4909]: E1002 18:20:06.607913 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:06 crc kubenswrapper[4909]: I1002 18:20:06.608134 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:06 crc kubenswrapper[4909]: E1002 18:20:06.608095 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:06 crc kubenswrapper[4909]: E1002 18:20:06.608227 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:06 crc kubenswrapper[4909]: E1002 18:20:06.608325 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:08 crc kubenswrapper[4909]: I1002 18:20:08.608426 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:08 crc kubenswrapper[4909]: I1002 18:20:08.608462 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:08 crc kubenswrapper[4909]: I1002 18:20:08.608521 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:08 crc kubenswrapper[4909]: I1002 18:20:08.608485 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:08 crc kubenswrapper[4909]: E1002 18:20:08.608611 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:08 crc kubenswrapper[4909]: E1002 18:20:08.608688 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:08 crc kubenswrapper[4909]: E1002 18:20:08.608842 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:08 crc kubenswrapper[4909]: E1002 18:20:08.608969 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:09 crc kubenswrapper[4909]: I1002 18:20:09.610422 4909 scope.go:117] "RemoveContainer" containerID="5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface" Oct 02 18:20:09 crc kubenswrapper[4909]: E1002 18:20:09.712564 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.383562 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/1.log" Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.383647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerStarted","Data":"44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95"} Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.608156 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.608176 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:10 crc kubenswrapper[4909]: E1002 18:20:10.608419 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.608223 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:10 crc kubenswrapper[4909]: E1002 18:20:10.608540 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:10 crc kubenswrapper[4909]: I1002 18:20:10.608210 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:10 crc kubenswrapper[4909]: E1002 18:20:10.608690 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:10 crc kubenswrapper[4909]: E1002 18:20:10.608777 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:12 crc kubenswrapper[4909]: I1002 18:20:12.608212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:12 crc kubenswrapper[4909]: I1002 18:20:12.608284 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:12 crc kubenswrapper[4909]: I1002 18:20:12.608344 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:12 crc kubenswrapper[4909]: I1002 18:20:12.608252 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:12 crc kubenswrapper[4909]: E1002 18:20:12.608430 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:12 crc kubenswrapper[4909]: E1002 18:20:12.608569 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:12 crc kubenswrapper[4909]: E1002 18:20:12.608743 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:12 crc kubenswrapper[4909]: E1002 18:20:12.608952 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:14 crc kubenswrapper[4909]: I1002 18:20:14.607781 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:14 crc kubenswrapper[4909]: I1002 18:20:14.607892 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:14 crc kubenswrapper[4909]: I1002 18:20:14.607954 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:14 crc kubenswrapper[4909]: I1002 18:20:14.607971 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:14 crc kubenswrapper[4909]: E1002 18:20:14.607911 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 18:20:14 crc kubenswrapper[4909]: E1002 18:20:14.608101 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wd57x" podUID="ccda21fa-5211-460d-b521-fc5c86673b73" Oct 02 18:20:14 crc kubenswrapper[4909]: E1002 18:20:14.608400 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 18:20:14 crc kubenswrapper[4909]: E1002 18:20:14.608619 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.608132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.608173 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.608207 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.608239 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.611167 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.611384 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.612240 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.612252 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.612788 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 18:20:16 crc kubenswrapper[4909]: I1002 18:20:16.613004 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.920597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.980457 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.981079 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.981921 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.982634 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.983441 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knwzk"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.984157 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.984608 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.985300 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-265wz"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.985931 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.986444 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5bt4k"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.987183 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.987720 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.987743 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.988436 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.991549 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.991646 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.995691 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.997285 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gzl4v"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.998247 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"] Oct 02 18:20:18 crc kubenswrapper[4909]: I1002 18:20:18.999162 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xbgsg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:18.999974 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"] Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.000366 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.000646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.001285 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.003242 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.003701 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.007589 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.007704 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.007979 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.008088 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.008135 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.008221 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.017329 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.007989 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.020239 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.032708 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.033007 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.033439 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.033799 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.034233 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.035468 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.036072 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.036941 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.037774 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.055731 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.056299 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.056754 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058173 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058424 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058432 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058610 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058741 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058827 4909 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-59pwn"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058933 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.058950 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059019 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059278 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059366 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059197 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f4f74"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059732 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059926 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.059999 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060005 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.059949 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060105 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060322 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060430 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060523 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060591 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060653 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060708 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060758 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060390 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.060923 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.061002 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.061183 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.061377 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.061570 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.061783 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062294 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062414 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062560 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062629 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062705 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062759 4909 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.062570 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.063323 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.063818 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.069883 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.072366 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.072850 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.073523 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jlsmw"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.073943 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.073998 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-f4f74" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.075018 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.075370 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.075609 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.076280 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.076994 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.078813 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.082113 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.082581 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.082733 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.082824 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.083519 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.083673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.083780 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.084014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcb7044-167e-4e31-83a9-cfb9345e5195-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.084176 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.084359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspth\" 
(UniqueName: \"kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.085839 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b27hf\" (UniqueName: \"kubernetes.io/projected/9c09468b-4235-4ea8-97ba-e395ef79d16a-kube-api-access-b27hf\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.086012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit-dir\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.086180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4l2\" (UniqueName: \"kubernetes.io/projected/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-kube-api-access-cx4l2\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.086346 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-config\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.086511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ed97fb-f718-4241-95c3-c438a6cee18a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.086690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-serving-cert\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.086888 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-etcd-client\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.087104 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-image-import-ca\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102151 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit\") pod \"apiserver-76f77b778f-5bt4k\" (UID: 
\"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102235 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-client\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-auth-proxy-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102285 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-machine-approver-tls\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102306 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlhf\" (UniqueName: \"kubernetes.io/projected/b82524c9-70dd-408b-a918-d1a27149732d-kube-api-access-lhlhf\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-encryption-config\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-config\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-node-pullsecrets\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102403 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkhj\" (UniqueName: \"kubernetes.io/projected/efcb7044-167e-4e31-83a9-cfb9345e5195-kube-api-access-vpkhj\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-serving-cert\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: 
I1002 18:20:19.102445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102463 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102504 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: 
\"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ed97fb-f718-4241-95c3-c438a6cee18a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102564 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs7ln\" (UniqueName: \"kubernetes.io/projected/59ed97fb-f718-4241-95c3-c438a6cee18a-kube-api-access-qs7ln\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102587 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-serving-ca\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.102606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-service-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.104895 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-f9d7485db-p4h8w"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.106994 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.084429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.108399 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.083861 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.083836 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.109394 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.109689 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.109754 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.120488 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.120854 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.121819 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.122101 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.122379 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.124633 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.124682 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.124679 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.124959 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.124636 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125118 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125224 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125333 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125379 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125447 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.125492 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125507 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125612 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.125659 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126410 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126450 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126525 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126579 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126655 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.126806 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.129683 4909 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.129929 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.130610 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.130828 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.130992 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.133926 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.134399 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.137295 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.137413 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.137529 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.137555 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.137567 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.138219 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.139251 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.141114 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.142275 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.142497 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.143335 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.143736 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.144249 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.144459 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.145036 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.146075 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.148753 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149015 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149151 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149393 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149496 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149700 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.149936 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.150764 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.151755 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.153665 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.155893 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.156152 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.157132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.157660 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.163159 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.166977 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.167685 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.168880 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.169284 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.170471 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knwzk"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.172225 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jbrhg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.172618 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.173807 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.183749 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.187328 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.193327 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jlsmw"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.193685 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"] Oct 
02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.195564 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.195607 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xbgsg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.196666 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.197755 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.198637 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.199701 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-59pwn"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.200592 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.201559 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.202552 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203148 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-image-import-ca\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203176 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203196 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-client\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06269dc9-bb99-4485-be56-b32d44f11f62-serving-cert\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203233 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-auth-proxy-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203276 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203308 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203326 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-machine-approver-tls\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlhf\" (UniqueName: \"kubernetes.io/projected/b82524c9-70dd-408b-a918-d1a27149732d-kube-api-access-lhlhf\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-encryption-config\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" 
Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203395 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-config\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203413 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-node-pullsecrets\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203450 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkhj\" (UniqueName: \"kubernetes.io/projected/efcb7044-167e-4e31-83a9-cfb9345e5195-kube-api-access-vpkhj\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-serving-cert\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203506 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljt9k\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-kube-api-access-ljt9k\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203545 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203581 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203600 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ed97fb-f718-4241-95c3-c438a6cee18a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bea17ea-bc16-4f51-a175-f6e46279fd9b-metrics-tls\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203662 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qs7ln\" (UniqueName: \"kubernetes.io/projected/59ed97fb-f718-4241-95c3-c438a6cee18a-kube-api-access-qs7ln\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-serving-ca\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-service-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203721 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcb7044-167e-4e31-83a9-cfb9345e5195-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203760 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-service-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspth\" (UniqueName: \"kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203811 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b27hf\" (UniqueName: \"kubernetes.io/projected/9c09468b-4235-4ea8-97ba-e395ef79d16a-kube-api-access-b27hf\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203850 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203870 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit-dir\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203886 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bea17ea-bc16-4f51-a175-f6e46279fd9b-trusted-ca\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203906 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4l2\" (UniqueName: \"kubernetes.io/projected/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-kube-api-access-cx4l2\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.203952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-config\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.203998 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ed97fb-f718-4241-95c3-c438a6cee18a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjq9d\" (UniqueName: \"kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204057 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-serving-cert\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204100 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmtw5\" (UniqueName: \"kubernetes.io/projected/06269dc9-bb99-4485-be56-b32d44f11f62-kube-api-access-mmtw5\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204119 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: 
I1002 18:20:19.204145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-etcd-client\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204162 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-config\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204208 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204714 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-image-import-ca\") pod 
\"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.204973 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8wr64"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.206068 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-auth-proxy-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.206292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.206592 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.207282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-config\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.208094 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ed97fb-f718-4241-95c3-c438a6cee18a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.208343 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.208422 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-node-pullsecrets\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.208799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.208988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-config\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.209908 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-config\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.210359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.210483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c09468b-4235-4ea8-97ba-e395ef79d16a-audit-dir\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.210679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.211396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-serving-ca\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.211506 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-machine-approver-tls\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.211885 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b82524c9-70dd-408b-a918-d1a27149732d-etcd-service-ca\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-serving-cert\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212243 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212275 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212287 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 
18:20:19.212354 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5bt4k"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.212444 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8wr64" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.213509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ed97fb-f718-4241-95c3-c438a6cee18a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.213543 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f4f74"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.213628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcb7044-167e-4e31-83a9-cfb9345e5195-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.213851 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-etcd-client\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.214882 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-encryption-config\") 
pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.214937 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.215575 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.215632 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b82524c9-70dd-408b-a918-d1a27149732d-etcd-client\") pod \"etcd-operator-b45778765-knwzk\" (UID: \"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.216620 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.219183 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qkdhr"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.220611 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.220713 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.221308 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.222453 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.222857 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.223445 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.224492 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.225792 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.226400 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c09468b-4235-4ea8-97ba-e395ef79d16a-serving-cert\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.226949 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.228099 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p4h8w"] 
Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.229080 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.230110 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qkdhr"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.231121 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.232285 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jbrhg"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.233643 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.234544 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sxts5"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.235906 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sxts5"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.236009 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.242582 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.263107 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.284104 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.285830 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-htdx9"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.286643 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.294460 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-htdx9"] Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.303172 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.304663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.304818 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir\") pod \"oauth-openshift-558db77b4-8nk9p\" 
(UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.304823 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljt9k\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-kube-api-access-ljt9k\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.304901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bea17ea-bc16-4f51-a175-f6e46279fd9b-metrics-tls\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.304938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305003 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-service-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bea17ea-bc16-4f51-a175-f6e46279fd9b-trusted-ca\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305223 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjq9d\" (UniqueName: \"kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305348 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmtw5\" (UniqueName: \"kubernetes.io/projected/06269dc9-bb99-4485-be56-b32d44f11f62-kube-api-access-mmtw5\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305404 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305434 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-config\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305465 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305503 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06269dc9-bb99-4485-be56-b32d44f11f62-serving-cert\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305640 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.305714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.306144 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.308459 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.308959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-service-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.309408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.309470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.310730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.311041 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bea17ea-bc16-4f51-a175-f6e46279fd9b-trusted-ca\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.315313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06269dc9-bb99-4485-be56-b32d44f11f62-config\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.316854 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.317893 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: 
\"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.319317 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.319864 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.320408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.323287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06269dc9-bb99-4485-be56-b32d44f11f62-serving-cert\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.323257 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.324694 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bea17ea-bc16-4f51-a175-f6e46279fd9b-metrics-tls\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.324932 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.325069 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.325160 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.343371 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.364279 4909 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.382838 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.404304 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.424064 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.443693 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.463418 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.483255 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.504351 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.523376 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.543865 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.564318 4909 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.583931 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.604215 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.624650 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.644116 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.663816 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.684470 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.704260 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.744634 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.774916 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.783749 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 18:20:19 crc 
kubenswrapper[4909]: I1002 18:20:19.803990 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.833100 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.843455 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.883729 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.904652 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.924174 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.943825 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.964742 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 18:20:19 crc kubenswrapper[4909]: I1002 18:20:19.984185 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.003601 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.023935 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.043489 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.064200 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.084593 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.103646 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.123968 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.144851 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.161466 4909 request.go:700] Waited for 1.016465749s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.163822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 18:20:20 crc 
kubenswrapper[4909]: I1002 18:20:20.184324 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.204192 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.224889 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.244091 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.263607 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.284942 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.303568 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.323343 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.343788 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.363991 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 
18:20:20.385011 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.423636 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.429654 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.444250 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.463517 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.483845 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.503337 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.524126 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.544188 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.563179 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.583591 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" 
Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.603738 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.624285 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.644452 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.663623 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.683812 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.703734 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.723183 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.743686 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.762791 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.809896 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlhf\" (UniqueName: \"kubernetes.io/projected/b82524c9-70dd-408b-a918-d1a27149732d-kube-api-access-lhlhf\") pod \"etcd-operator-b45778765-knwzk\" (UID: 
\"b82524c9-70dd-408b-a918-d1a27149732d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.828577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspth\" (UniqueName: \"kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth\") pod \"controller-manager-879f6c89f-xhtq7\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.850327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b27hf\" (UniqueName: \"kubernetes.io/projected/9c09468b-4235-4ea8-97ba-e395ef79d16a-kube-api-access-b27hf\") pod \"apiserver-76f77b778f-5bt4k\" (UID: \"9c09468b-4235-4ea8-97ba-e395ef79d16a\") " pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.865756 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.875450 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkhj\" (UniqueName: \"kubernetes.io/projected/efcb7044-167e-4e31-83a9-cfb9345e5195-kube-api-access-vpkhj\") pod \"cluster-samples-operator-665b6dd947-xw5x7\" (UID: \"efcb7044-167e-4e31-83a9-cfb9345e5195\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.888500 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.890514 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs7ln\" (UniqueName: \"kubernetes.io/projected/59ed97fb-f718-4241-95c3-c438a6cee18a-kube-api-access-qs7ln\") pod \"openshift-apiserver-operator-796bbdcf4f-xlfhx\" (UID: \"59ed97fb-f718-4241-95c3-c438a6cee18a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.905908 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.911973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4l2\" (UniqueName: \"kubernetes.io/projected/dc1e28de-9ce9-4bec-afc0-a1df4509b9dc-kube-api-access-cx4l2\") pod \"machine-approver-56656f9798-265wz\" (UID: \"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.923551 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.930055 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.945151 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.964184 4909 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 18:20:20 crc kubenswrapper[4909]: I1002 18:20:20.987583 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.005390 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.024405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.049395 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.065507 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.084014 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.102880 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.106071 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.118157 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knwzk"] Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.123969 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.132763 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb82524c9_70dd_408b_a918_d1a27149732d.slice/crio-8970199858d577f7900dcd8ff4bf264f19e370aa4b08b52df6b8754fae96b94e WatchSource:0}: Error finding container 8970199858d577f7900dcd8ff4bf264f19e370aa4b08b52df6b8754fae96b94e: Status 404 returned error can't find the container with id 8970199858d577f7900dcd8ff4bf264f19e370aa4b08b52df6b8754fae96b94e Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.137464 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.143444 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.162436 4909 request.go:700] Waited for 1.855216893s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.165915 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx"] Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.170240 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5bt4k"] Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.171662 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.187046 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ed97fb_f718_4241_95c3_c438a6cee18a.slice/crio-1ecb5f343eb66f88267c67b63e5cb99579f02df793929dad3782bc59adedba51 WatchSource:0}: Error finding container 1ecb5f343eb66f88267c67b63e5cb99579f02df793929dad3782bc59adedba51: Status 404 returned error can't find the container with id 1ecb5f343eb66f88267c67b63e5cb99579f02df793929dad3782bc59adedba51 Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.188954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljt9k\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-kube-api-access-ljt9k\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.208838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bea17ea-bc16-4f51-a175-f6e46279fd9b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xn25h\" (UID: \"3bea17ea-bc16-4f51-a175-f6e46279fd9b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.226666 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjq9d\" (UniqueName: \"kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d\") pod \"oauth-openshift-558db77b4-8nk9p\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.239328 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mmtw5\" (UniqueName: \"kubernetes.io/projected/06269dc9-bb99-4485-be56-b32d44f11f62-kube-api-access-mmtw5\") pod \"authentication-operator-69f744f599-59pwn\" (UID: \"06269dc9-bb99-4485-be56-b32d44f11f62\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.268725 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.316956 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"] Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.336006 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343712 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343761 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/858c7aa2-095b-46d8-ad52-798fce64d3e8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343791 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hks\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228eb0f9-069f-40ba-97f0-bc452b34aa28-config\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343895 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-client\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqlj\" (UniqueName: \"kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343957 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-service-ca-bundle\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.343971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228eb0f9-069f-40ba-97f0-bc452b34aa28-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344039 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-trusted-ca\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x6kr\" (UniqueName: \"kubernetes.io/projected/aea04f33-66f3-4724-b1cc-e37aa52a23b1-kube-api-access-6x6kr\") pod \"downloads-7954f5f757-f4f74\" (UID: \"aea04f33-66f3-4724-b1cc-e37aa52a23b1\") " pod="openshift-console/downloads-7954f5f757-f4f74" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrbc\" (UniqueName: \"kubernetes.io/projected/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-kube-api-access-msrbc\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344103 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.344122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfgh\" (UniqueName: \"kubernetes.io/projected/f13738b4-01c1-4f4e-97d7-264026988f3f-kube-api-access-kjfgh\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-config\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344155 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344191 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-images\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344206 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c7aa2-095b-46d8-ad52-798fce64d3e8-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344220 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-encryption-config\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8c82\" (UniqueName: \"kubernetes.io/projected/1eb7c0e0-0413-4ea1-9b9d-253f31a0a162-kube-api-access-c8c82\") pod \"migrator-59844c95c7-v5mwt\" (UID: \"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344277 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-serving-cert\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.344304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-serving-cert\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-metrics-certs\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344335 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzrk\" (UniqueName: \"kubernetes.io/projected/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-kube-api-access-8rzrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-dir\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/616911ad-becd-4f1f-b626-7bdbf8371f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-policies\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffwm\" (UniqueName: \"kubernetes.io/projected/36b9cbd7-5e82-4510-9291-77e63d43f6d5-kube-api-access-tffwm\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/173bf52b-b4ff-4331-a35b-9e70cacfc411-metrics-tls\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.344472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344525 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13738b4-01c1-4f4e-97d7-264026988f3f-proxy-tls\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344585 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgvf\" (UniqueName: 
\"kubernetes.io/projected/93ad0590-e5cc-4429-8438-b2128b560d2e-kube-api-access-kmgvf\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344647 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/616911ad-becd-4f1f-b626-7bdbf8371f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36b9cbd7-5e82-4510-9291-77e63d43f6d5-serving-cert\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtsr\" (UniqueName: \"kubernetes.io/projected/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-kube-api-access-fmtsr\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344764 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-stats-auth\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " 
pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344785 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344803 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-default-certificate\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344819 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-config\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344843 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344903 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/228eb0f9-069f-40ba-97f0-bc452b34aa28-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/616911ad-becd-4f1f-b626-7bdbf8371f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qngd\" (UniqueName: \"kubernetes.io/projected/858c7aa2-095b-46d8-ad52-798fce64d3e8-kube-api-access-8qngd\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc89r\" (UniqueName: \"kubernetes.io/projected/173bf52b-b4ff-4331-a35b-9e70cacfc411-kube-api-access-lc89r\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.344981 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f13738b4-01c1-4f4e-97d7-264026988f3f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345052 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26bl\" (UniqueName: \"kubernetes.io/projected/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-kube-api-access-g26bl\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345069 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345098 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345139 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.345158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.346413 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:21.846400886 +0000 UTC m=+143.033896745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.346778 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0eb575_3d34_41d8_b6ac_12225721c074.slice/crio-bf22872882d10fefdcf13697cbc6dc13a75ebf5711d5419d244605b2d02c5cee WatchSource:0}: Error finding container bf22872882d10fefdcf13697cbc6dc13a75ebf5711d5419d244605b2d02c5cee: Status 404 returned error can't find the container with id bf22872882d10fefdcf13697cbc6dc13a75ebf5711d5419d244605b2d02c5cee
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.360739 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.365730 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7"]
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.445261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" event={"ID":"df0eb575-3d34-41d8-b6ac-12225721c074","Type":"ContainerStarted","Data":"bf22872882d10fefdcf13697cbc6dc13a75ebf5711d5419d244605b2d02c5cee"}
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.445731 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446256 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/228eb0f9-069f-40ba-97f0-bc452b34aa28-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-node-bootstrap-token\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446338 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc89r\" (UniqueName: \"kubernetes.io/projected/173bf52b-b4ff-4331-a35b-9e70cacfc411-kube-api-access-lc89r\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/616911ad-becd-4f1f-b626-7bdbf8371f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qngd\" (UniqueName: \"kubernetes.io/projected/858c7aa2-095b-46d8-ad52-798fce64d3e8-kube-api-access-8qngd\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446467 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g26bl\" (UniqueName: \"kubernetes.io/projected/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-kube-api-access-g26bl\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446487 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ncb\" (UniqueName: \"kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446612 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhx7\" (UniqueName: \"kubernetes.io/projected/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-kube-api-access-jdhx7\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446742 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-mountpoint-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/858c7aa2-095b-46d8-ad52-798fce64d3e8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446889 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.446915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hks\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.446983 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:21.946938361 +0000 UTC m=+143.134434210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.447042 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsdt\" (UniqueName: \"kubernetes.io/projected/37b24b0d-7804-44b7-a0e4-80c89580fd36-kube-api-access-7jsdt\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.457152 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.458688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85v4f\" (UniqueName: \"kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.458782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-plugins-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.458806 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.459262 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/616911ad-becd-4f1f-b626-7bdbf8371f55-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.459778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" event={"ID":"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc","Type":"ContainerStarted","Data":"5b560ff898f7b4a863981172f5693d963022233e1c19ab0964f03174088096ff"}
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.460677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.461247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228eb0f9-069f-40ba-97f0-bc452b34aa28-config\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.461287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-socket-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.461313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.461353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-client\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.461978 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.462610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228eb0f9-069f-40ba-97f0-bc452b34aa28-config\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.463795 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.463842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228eb0f9-069f-40ba-97f0-bc452b34aa28-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.463862 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-trusted-ca\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.464088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8cac17e4-1168-4aa3-9609-03dbaf5f76df-tmpfs\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.464115 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f3c0da1-b7e1-4efb-9a99-cd839d704925-config-volume\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.464147 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5670a8-418f-4523-ba53-dc5553cd867c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.464168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf82t\" (UniqueName: \"kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.464189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x6kr\" (UniqueName: \"kubernetes.io/projected/aea04f33-66f3-4724-b1cc-e37aa52a23b1-kube-api-access-6x6kr\") pod \"downloads-7954f5f757-f4f74\" (UID: \"aea04f33-66f3-4724-b1cc-e37aa52a23b1\") " pod="openshift-console/downloads-7954f5f757-f4f74"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" event={"ID":"b82524c9-70dd-408b-a918-d1a27149732d","Type":"ContainerStarted","Data":"8970199858d577f7900dcd8ff4bf264f19e370aa4b08b52df6b8754fae96b94e"}
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465340 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-trusted-ca\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465440 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465490 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5vdt\" (UniqueName: \"kubernetes.io/projected/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-kube-api-access-g5vdt\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465513 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27121d66-e52c-4891-b50f-0509ccf58b38-config\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465533 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465650 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-profile-collector-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c7aa2-095b-46d8-ad52-798fce64d3e8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.465836 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj5r\" (UniqueName: \"kubernetes.io/projected/5f3c0da1-b7e1-4efb-9a99-cd839d704925-kube-api-access-4wj5r\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-key\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-serving-cert\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c7aa2-095b-46d8-ad52-798fce64d3e8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466675 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-registration-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466712 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13738b4-01c1-4f4e-97d7-264026988f3f-proxy-tls\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466881 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" event={"ID":"59ed97fb-f718-4241-95c3-c438a6cee18a","Type":"ContainerStarted","Data":"646806090dbab023380df415421d7752359dc2c6289fc74b66c3e14990bb971b"}
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" event={"ID":"59ed97fb-f718-4241-95c3-c438a6cee18a","Type":"ContainerStarted","Data":"1ecb5f343eb66f88267c67b63e5cb99579f02df793929dad3782bc59adedba51"}
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466781 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.466995 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f3c0da1-b7e1-4efb-9a99-cd839d704925-metrics-tls\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54ch\" (UniqueName: \"kubernetes.io/projected/af008f49-d73f-49c4-a3c7-b668fdfbce0f-kube-api-access-j54ch\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgvf\" (UniqueName: \"kubernetes.io/projected/93ad0590-e5cc-4429-8438-b2128b560d2e-kube-api-access-kmgvf\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/616911ad-becd-4f1f-b626-7bdbf8371f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtsr\" (UniqueName: \"kubernetes.io/projected/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-kube-api-access-fmtsr\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467456 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36b9cbd7-5e82-4510-9291-77e63d43f6d5-serving-cert\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.467993 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.469679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.469710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8mlh\" (UniqueName: \"kubernetes.io/projected/27121d66-e52c-4891-b50f-0509ccf58b38-kube-api-access-q8mlh\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.469728 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.469781 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-config\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.470176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert\") pod \"console-f9d7485db-p4h8w\" (UID:
\"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.470687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-webhook-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.470918 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-etcd-client\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.470947 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.470985 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-srv-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f13738b4-01c1-4f4e-97d7-264026988f3f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznz2\" (UniqueName: \"kubernetes.io/projected/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-kube-api-access-pznz2\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27121d66-e52c-4891-b50f-0509ccf58b38-serving-cert\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471162 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j85q\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-kube-api-access-4j85q\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228eb0f9-069f-40ba-97f0-bc452b34aa28-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471456 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/858c7aa2-095b-46d8-ad52-798fce64d3e8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471462 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.471659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" event={"ID":"9c09468b-4235-4ea8-97ba-e395ef79d16a","Type":"ContainerStarted","Data":"0c2aca5cb649bda5a15cf95320f507a666e7527ff5e502d867da8e8d87bbe4ed"} Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471726 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13738b4-01c1-4f4e-97d7-264026988f3f-proxy-tls\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.471960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5bf2\" (UniqueName: \"kubernetes.io/projected/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-kube-api-access-j5bf2\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 
18:20:21.471989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472070 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-certs\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-config\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-apiservice-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f13738b4-01c1-4f4e-97d7-264026988f3f-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-srv-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqlj\" (UniqueName: \"kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-service-ca-bundle\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff386cb-82c9-451c-af23-09930f9359e8-cert\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472900 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrbc\" (UniqueName: \"kubernetes.io/projected/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-kube-api-access-msrbc\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37b24b0d-7804-44b7-a0e4-80c89580fd36-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 
18:20:21.472955 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfgh\" (UniqueName: \"kubernetes.io/projected/f13738b4-01c1-4f4e-97d7-264026988f3f-kube-api-access-kjfgh\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-config\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.472988 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-csi-data-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-images\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473077 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-images\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473291 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-service-ca-bundle\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473326 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcgcx\" (UniqueName: \"kubernetes.io/projected/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-kube-api-access-pcgcx\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-serving-cert\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473356 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-serving-cert\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473540 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-encryption-config\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473823 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8c82\" (UniqueName: \"kubernetes.io/projected/1eb7c0e0-0413-4ea1-9b9d-253f31a0a162-kube-api-access-c8c82\") pod \"migrator-59844c95c7-v5mwt\" (UID: \"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473843 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmdkl\" (UniqueName: \"kubernetes.io/projected/8cac17e4-1168-4aa3-9609-03dbaf5f76df-kube-api-access-lmdkl\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473869 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-metrics-certs\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzrk\" (UniqueName: \"kubernetes.io/projected/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-kube-api-access-8rzrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473968 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/616911ad-becd-4f1f-b626-7bdbf8371f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-dir\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.473999 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-policies\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffwm\" (UniqueName: \"kubernetes.io/projected/36b9cbd7-5e82-4510-9291-77e63d43f6d5-kube-api-access-tffwm\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474047 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/173bf52b-b4ff-4331-a35b-9e70cacfc411-metrics-tls\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474139 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-stats-auth\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474188 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-proxy-tls\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474204 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-default-certificate\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474222 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk97w\" (UniqueName: \"kubernetes.io/projected/c23b0961-4854-4dda-a5f6-861fb04604fa-kube-api-access-mk97w\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.474237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d5670a8-418f-4523-ba53-dc5553cd867c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-cabundle\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474334 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs97p\" (UniqueName: \"kubernetes.io/projected/bff386cb-82c9-451c-af23-09930f9359e8-kube-api-access-bs97p\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: 
\"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrvz\" (UniqueName: \"kubernetes.io/projected/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-kube-api-access-mjrvz\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.474615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9cbd7-5e82-4510-9291-77e63d43f6d5-config\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.476505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-dir\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.476642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-images\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.476926 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.477702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.477741 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-config\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.478408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qngd\" (UniqueName: \"kubernetes.io/projected/858c7aa2-095b-46d8-ad52-798fce64d3e8-kube-api-access-8qngd\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv97q\" (UID: \"858c7aa2-095b-46d8-ad52-798fce64d3e8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.478703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.479710 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36b9cbd7-5e82-4510-9291-77e63d43f6d5-serving-cert\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.479728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.479731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.479868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93ad0590-e5cc-4429-8438-b2128b560d2e-audit-policies\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.480001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-stats-auth\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.481365 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-metrics-certs\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.482298 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.482892 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/616911ad-becd-4f1f-b626-7bdbf8371f55-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.482998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-serving-cert\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.483969 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 
02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.483970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.486316 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-default-certificate\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.489773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/173bf52b-b4ff-4331-a35b-9e70cacfc411-metrics-tls\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.496539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/93ad0590-e5cc-4429-8438-b2128b560d2e-encryption-config\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.500638 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc89r\" (UniqueName: \"kubernetes.io/projected/173bf52b-b4ff-4331-a35b-9e70cacfc411-kube-api-access-lc89r\") pod \"dns-operator-744455d44c-jlsmw\" (UID: \"173bf52b-b4ff-4331-a35b-9e70cacfc411\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.504002 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h"] Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.519465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g26bl\" (UniqueName: \"kubernetes.io/projected/6e789d01-3b09-4c6c-9931-f1a1bc87ee4a-kube-api-access-g26bl\") pod \"router-default-5444994796-gzl4v\" (UID: \"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a\") " pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.522805 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bea17ea_bc16_4f51_a175_f6e46279fd9b.slice/crio-ff13edd3697c13014bd0a93abb9985b01a7303ae71b14be01c64ab2bf8b5d7fa WatchSource:0}: Error finding container ff13edd3697c13014bd0a93abb9985b01a7303ae71b14be01c64ab2bf8b5d7fa: Status 404 returned error can't find the container with id ff13edd3697c13014bd0a93abb9985b01a7303ae71b14be01c64ab2bf8b5d7fa Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.541031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hks\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.559206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/228eb0f9-069f-40ba-97f0-bc452b34aa28-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5zs5c\" (UID: \"228eb0f9-069f-40ba-97f0-bc452b34aa28\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5670a8-418f-4523-ba53-dc5553cd867c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf82t\" (UniqueName: \"kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f3c0da1-b7e1-4efb-9a99-cd839d704925-config-volume\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vdt\" (UniqueName: \"kubernetes.io/projected/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-kube-api-access-g5vdt\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/27121d66-e52c-4891-b50f-0509ccf58b38-config\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575274 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575295 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-profile-collector-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj5r\" (UniqueName: \"kubernetes.io/projected/5f3c0da1-b7e1-4efb-9a99-cd839d704925-kube-api-access-4wj5r\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-key\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575367 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-registration-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f3c0da1-b7e1-4efb-9a99-cd839d704925-metrics-tls\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575425 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54ch\" (UniqueName: \"kubernetes.io/projected/af008f49-d73f-49c4-a3c7-b668fdfbce0f-kube-api-access-j54ch\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575498 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575526 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8mlh\" (UniqueName: \"kubernetes.io/projected/27121d66-e52c-4891-b50f-0509ccf58b38-kube-api-access-q8mlh\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575547 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575576 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-webhook-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-srv-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575634 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pznz2\" (UniqueName: \"kubernetes.io/projected/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-kube-api-access-pznz2\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27121d66-e52c-4891-b50f-0509ccf58b38-serving-cert\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j85q\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-kube-api-access-4j85q\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575752 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575773 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j5bf2\" (UniqueName: \"kubernetes.io/projected/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-kube-api-access-j5bf2\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-certs\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-config\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575836 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-apiservice-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575872 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-srv-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff386cb-82c9-451c-af23-09930f9359e8-cert\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575955 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37b24b0d-7804-44b7-a0e4-80c89580fd36-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.575985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-csi-data-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576012 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-images\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: 
\"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576128 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcgcx\" (UniqueName: \"kubernetes.io/projected/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-kube-api-access-pcgcx\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmdkl\" (UniqueName: \"kubernetes.io/projected/8cac17e4-1168-4aa3-9609-03dbaf5f76df-kube-api-access-lmdkl\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576619 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-registration-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.576997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27121d66-e52c-4891-b50f-0509ccf58b38-config\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.577660 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-csi-data-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.578356 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-images\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.578642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f3c0da1-b7e1-4efb-9a99-cd839d704925-config-volume\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579283 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d5670a8-418f-4523-ba53-dc5553cd867c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579675 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-proxy-tls\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk97w\" (UniqueName: \"kubernetes.io/projected/c23b0961-4854-4dda-a5f6-861fb04604fa-kube-api-access-mk97w\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579720 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d5670a8-418f-4523-ba53-dc5553cd867c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:21 crc 
kubenswrapper[4909]: I1002 18:20:21.579777 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs97p\" (UniqueName: \"kubernetes.io/projected/bff386cb-82c9-451c-af23-09930f9359e8-kube-api-access-bs97p\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579796 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-cabundle\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.579819 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrvz\" (UniqueName: \"kubernetes.io/projected/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-kube-api-access-mjrvz\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.580921 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.581743 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d5670a8-418f-4523-ba53-dc5553cd867c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.581936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-cabundle\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.582192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-node-bootstrap-token\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.582883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ncb\" (UniqueName: \"kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583175 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhx7\" (UniqueName: \"kubernetes.io/projected/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-kube-api-access-jdhx7\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583197 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-mountpoint-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583319 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsdt\" (UniqueName: \"kubernetes.io/projected/37b24b0d-7804-44b7-a0e4-80c89580fd36-kube-api-access-7jsdt\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583367 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85v4f\" (UniqueName: \"kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583387 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-plugins-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-socket-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583481 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8cac17e4-1168-4aa3-9609-03dbaf5f76df-tmpfs\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-proxy-tls\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.583890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8cac17e4-1168-4aa3-9609-03dbaf5f76df-tmpfs\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.584897 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-config\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.585007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.585177 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-socket-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.586014 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.586080 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-plugins-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.586103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c23b0961-4854-4dda-a5f6-861fb04604fa-mountpoint-dir\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.586357 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.586703 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.086589331 +0000 UTC m=+143.274085190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.589349 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.590532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-node-bootstrap-token\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.590957 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.596088 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.597222 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.597235 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-apiservice-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.597497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37b24b0d-7804-44b7-a0e4-80c89580fd36-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.597659 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-srv-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598066 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598067 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598240 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff386cb-82c9-451c-af23-09930f9359e8-cert\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598510 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-certs\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598712 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/af008f49-d73f-49c4-a3c7-b668fdfbce0f-signing-key\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.598763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f3c0da1-b7e1-4efb-9a99-cd839d704925-metrics-tls\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.601537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.604637 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.605009 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-profile-collector-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.606367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27121d66-e52c-4891-b50f-0509ccf58b38-serving-cert\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.606476 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.607830 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cac17e4-1168-4aa3-9609-03dbaf5f76df-webhook-cert\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.608504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.619742 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-srv-cert\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.626572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x6kr\" (UniqueName: \"kubernetes.io/projected/aea04f33-66f3-4724-b1cc-e37aa52a23b1-kube-api-access-6x6kr\") pod \"downloads-7954f5f757-f4f74\" (UID: \"aea04f33-66f3-4724-b1cc-e37aa52a23b1\") " pod="openshift-console/downloads-7954f5f757-f4f74"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.653536 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f4f74"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.654209 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgvf\" (UniqueName: \"kubernetes.io/projected/93ad0590-e5cc-4429-8438-b2128b560d2e-kube-api-access-kmgvf\") pod \"apiserver-7bbb656c7d-l89pq\" (UID: \"93ad0590-e5cc-4429-8438-b2128b560d2e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.661949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtsr\" (UniqueName: \"kubernetes.io/projected/ce9d69ec-1859-40eb-8851-4e88d5fc7f37-kube-api-access-fmtsr\") pod \"openshift-config-operator-7777fb866f-qvn5d\" (UID: \"ce9d69ec-1859-40eb-8851-4e88d5fc7f37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.669823 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.679442 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.682347 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-59pwn"]
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.684620 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.685195 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.185176847 +0000 UTC m=+143.372672706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.686118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqlj\" (UniqueName: \"kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj\") pod \"console-f9d7485db-p4h8w\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.687314 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.699443 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrbc\" (UniqueName: \"kubernetes.io/projected/4bcc6ed7-1c31-4831-8b45-2729aaa5f89c-kube-api-access-msrbc\") pod \"machine-api-operator-5694c8668f-fgbzj\" (UID: \"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.708590 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.728015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfgh\" (UniqueName: \"kubernetes.io/projected/f13738b4-01c1-4f4e-97d7-264026988f3f-kube-api-access-kjfgh\") pod \"machine-config-controller-84d6567774-9flkb\" (UID: \"f13738b4-01c1-4f4e-97d7-264026988f3f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.735388 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"]
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.749706 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p4h8w"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.755439 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffwm\" (UniqueName: \"kubernetes.io/projected/36b9cbd7-5e82-4510-9291-77e63d43f6d5-kube-api-access-tffwm\") pod \"console-operator-58897d9998-xbgsg\" (UID: \"36b9cbd7-5e82-4510-9291-77e63d43f6d5\") " pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.758475 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8c82\" (UniqueName: \"kubernetes.io/projected/1eb7c0e0-0413-4ea1-9b9d-253f31a0a162-kube-api-access-c8c82\") pod \"migrator-59844c95c7-v5mwt\" (UID: \"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt"
Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.772309 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b563e9f_f5eb_4fb4_92c5_3f82e2ad43b6.slice/crio-a0dd6bcb7a7282dd345f134db6bf5be919b0c2c531fcfeefface1350a5836fe4 WatchSource:0}: Error finding container a0dd6bcb7a7282dd345f134db6bf5be919b0c2c531fcfeefface1350a5836fe4: Status 404 returned error can't find the container with id a0dd6bcb7a7282dd345f134db6bf5be919b0c2c531fcfeefface1350a5836fe4
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.778104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzrk\" (UniqueName: \"kubernetes.io/projected/7e5df8ca-21df-468a-9ee3-4f98b3ed95e6-kube-api-access-8rzrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-x5l2g\" (UID: \"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.786551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.787161 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.287144416 +0000 UTC m=+143.474640275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.799696 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/616911ad-becd-4f1f-b626-7bdbf8371f55-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j4dpc\" (UID: \"616911ad-becd-4f1f-b626-7bdbf8371f55\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.828416 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c"]
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.843470 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xbgsg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.854942 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.867800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5vdt\" (UniqueName: \"kubernetes.io/projected/42d8b133-dbdb-47ff-aaff-9f6b82fcad38-kube-api-access-g5vdt\") pod \"machine-config-server-8wr64\" (UID: \"42d8b133-dbdb-47ff-aaff-9f6b82fcad38\") " pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.881376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf82t\" (UniqueName: \"kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t\") pod \"route-controller-manager-6576b87f9c-vsdsq\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.894108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.894297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54ch\" (UniqueName: \"kubernetes.io/projected/af008f49-d73f-49c4-a3c7-b668fdfbce0f-kube-api-access-j54ch\") pod \"service-ca-9c57cc56f-jbrhg\" (UID: \"af008f49-d73f-49c4-a3c7-b668fdfbce0f\") " pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.894703 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8wr64"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.894856 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.394835152 +0000 UTC m=+143.582331011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.895128 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.899204 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.399191557 +0000 UTC m=+143.586687416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.907224 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.907853 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj5r\" (UniqueName: \"kubernetes.io/projected/5f3c0da1-b7e1-4efb-9a99-cd839d704925-kube-api-access-4wj5r\") pod \"dns-default-htdx9\" (UID: \"5f3c0da1-b7e1-4efb-9a99-cd839d704925\") " pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.921325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8mlh\" (UniqueName: \"kubernetes.io/projected/27121d66-e52c-4891-b50f-0509ccf58b38-kube-api-access-q8mlh\") pod \"service-ca-operator-777779d784-7c4s8\" (UID: \"27121d66-e52c-4891-b50f-0509ccf58b38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"
Oct 02 18:20:21 crc kubenswrapper[4909]: W1002 18:20:21.929741 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228eb0f9_069f_40ba_97f0_bc452b34aa28.slice/crio-282460d43525053fce86e953792721eef50c832d95622799164fdb4a8ab5ddbd WatchSource:0}: Error finding container 282460d43525053fce86e953792721eef50c832d95622799164fdb4a8ab5ddbd: Status 404 returned error can't find the container with id 282460d43525053fce86e953792721eef50c832d95622799164fdb4a8ab5ddbd
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.929998 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-htdx9"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.948586 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.987663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pznz2\" (UniqueName: \"kubernetes.io/projected/a2677c8a-f86e-4c9c-9c41-b59b38904cb4-kube-api-access-pznz2\") pod \"package-server-manager-789f6589d5-56w8v\" (UID: \"a2677c8a-f86e-4c9c-9c41-b59b38904cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.995885 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.996338 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.496212634 +0000 UTC m=+143.683708513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.997014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"
Oct 02 18:20:21 crc kubenswrapper[4909]: I1002 18:20:21.997112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:21 crc kubenswrapper[4909]: E1002 18:20:21.997652 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.497632808 +0000 UTC m=+143.685128667 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.007034 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmdkl\" (UniqueName: \"kubernetes.io/projected/8cac17e4-1168-4aa3-9609-03dbaf5f76df-kube-api-access-lmdkl\") pod \"packageserver-d55dfcdfc-blmm6\" (UID: \"8cac17e4-1168-4aa3-9609-03dbaf5f76df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.038256 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j85q\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-kube-api-access-4j85q\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.042536 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6fa9758-6958-41a4-bed8-61b9ead5b8f0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vd9vs\" (UID: \"c6fa9758-6958-41a4-bed8-61b9ead5b8f0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.047265 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcgcx\" (UniqueName: 
\"kubernetes.io/projected/3605f811-7ddc-4be1-9dd3-1d2af7fcdec2-kube-api-access-pcgcx\") pod \"catalog-operator-68c6474976-72wcs\" (UID: \"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.058339 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.059114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs97p\" (UniqueName: \"kubernetes.io/projected/bff386cb-82c9-451c-af23-09930f9359e8-kube-api-access-bs97p\") pod \"ingress-canary-sxts5\" (UID: \"bff386cb-82c9-451c-af23-09930f9359e8\") " pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.060868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bf2\" (UniqueName: \"kubernetes.io/projected/ef32ba74-fc1a-4af3-b2b9-207f4da636ad-kube-api-access-j5bf2\") pod \"olm-operator-6b444d44fb-v6cks\" (UID: \"ef32ba74-fc1a-4af3-b2b9-207f4da636ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.066568 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.087882 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrvz\" (UniqueName: \"kubernetes.io/projected/b1d614ec-9adb-4a95-b744-4ffced9f5f0e-kube-api-access-mjrvz\") pod \"machine-config-operator-74547568cd-ddkph\" (UID: \"b1d614ec-9adb-4a95-b744-4ffced9f5f0e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.090156 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.097899 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.098480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.098852 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.598837124 +0000 UTC m=+143.786332983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.101347 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ncb\" (UniqueName: \"kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb\") pod \"collect-profiles-29323815-2zsdp\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.107897 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.121939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhx7\" (UniqueName: \"kubernetes.io/projected/a7cc0f59-db36-4e76-93fe-4c99f3e621a0-kube-api-access-jdhx7\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4pvv\" (UID: \"a7cc0f59-db36-4e76-93fe-4c99f3e621a0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.132602 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.139015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk97w\" (UniqueName: \"kubernetes.io/projected/c23b0961-4854-4dda-a5f6-861fb04604fa-kube-api-access-mk97w\") pod \"csi-hostpathplugin-qkdhr\" (UID: \"c23b0961-4854-4dda-a5f6-861fb04604fa\") " pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.154874 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.158454 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.168974 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.180605 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d5670a8-418f-4523-ba53-dc5553cd867c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lqrnp\" (UID: \"2d5670a8-418f-4523-ba53-dc5553cd867c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.181413 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.184445 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsdt\" (UniqueName: \"kubernetes.io/projected/37b24b0d-7804-44b7-a0e4-80c89580fd36-kube-api-access-7jsdt\") pod \"multus-admission-controller-857f4d67dd-ktt4z\" (UID: \"37b24b0d-7804-44b7-a0e4-80c89580fd36\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.203882 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.204282 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:22.70426928 +0000 UTC m=+143.891765139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.205350 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85v4f\" (UniqueName: \"kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f\") pod \"marketplace-operator-79b997595-645jg\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.212756 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.221304 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sxts5" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.307083 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.307484 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 18:20:22.807449337 +0000 UTC m=+143.994945196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.338556 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d"] Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.375169 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.383704 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.409224 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.412663 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 18:20:22.912645017 +0000 UTC m=+144.100140876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.414549 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.422092 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.438453 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.514597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" event={"ID":"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc","Type":"ContainerStarted","Data":"4587489b51597e15d024312db480fba8433c7d2d0061fc2d058ecc8c89aa5165"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.514649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" event={"ID":"dc1e28de-9ce9-4bec-afc0-a1df4509b9dc","Type":"ContainerStarted","Data":"0f1d8c2847f2628a46da6e35c56fbc1ac55afe8a9cfe922d72d7eaf8aa4020e7"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.515048 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.515505 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.015482733 +0000 UTC m=+144.202978592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.553735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" event={"ID":"06269dc9-bb99-4485-be56-b32d44f11f62","Type":"ContainerStarted","Data":"e9a54576e185fa8118b62bef11b0b7b1b024a1df175eb7374502ec10d9f34004"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.553793 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" event={"ID":"06269dc9-bb99-4485-be56-b32d44f11f62","Type":"ContainerStarted","Data":"04e364eeacf1759c23b9634fd7079009a9bb684190a07a4101f720f98492b0e4"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.562748 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" event={"ID":"efcb7044-167e-4e31-83a9-cfb9345e5195","Type":"ContainerStarted","Data":"65f2f05d2cd9fb34234bb1e41938396af8c8d86174f1b0b4b3067315f1704aa1"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.562810 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" event={"ID":"efcb7044-167e-4e31-83a9-cfb9345e5195","Type":"ContainerStarted","Data":"f382774af5e2570d0779826b9a510134d114d038c2471135a02d2127a1c8323a"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.562826 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" event={"ID":"efcb7044-167e-4e31-83a9-cfb9345e5195","Type":"ContainerStarted","Data":"99303bedfddcd69007a1d9d94b24e33107dd4dfd19f2437c08689f1cc5e4ff5e"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.572667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" event={"ID":"df0eb575-3d34-41d8-b6ac-12225721c074","Type":"ContainerStarted","Data":"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.573785 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.574603 4909 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xhtq7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.574662 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.592888 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq"] Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.602172 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8wr64" 
event={"ID":"42d8b133-dbdb-47ff-aaff-9f6b82fcad38","Type":"ContainerStarted","Data":"c113d5b7a85dd225e4374431e661a75f5c38fa5a4458b54389a691e8b26988f9"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.602226 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8wr64" event={"ID":"42d8b133-dbdb-47ff-aaff-9f6b82fcad38","Type":"ContainerStarted","Data":"8f730686ee15ead73046d4c3a98bd3b636426836efeb57a54556671fcb0fa3a1"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.605109 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jlsmw"] Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.609793 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" event={"ID":"3bea17ea-bc16-4f51-a175-f6e46279fd9b","Type":"ContainerStarted","Data":"88872269c539b842dfe94c45ee438801a383238e3753ed3e05e129a10a2e00b7"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.610620 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" event={"ID":"3bea17ea-bc16-4f51-a175-f6e46279fd9b","Type":"ContainerStarted","Data":"357a94e7549fd01cc721e3dd90de025ac5427b351378b1df93bcd5d51132447c"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.610654 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" event={"ID":"3bea17ea-bc16-4f51-a175-f6e46279fd9b","Type":"ContainerStarted","Data":"ff13edd3697c13014bd0a93abb9985b01a7303ae71b14be01c64ab2bf8b5d7fa"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.616559 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.618368 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.11835386 +0000 UTC m=+144.305849719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.620300 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" event={"ID":"228eb0f9-069f-40ba-97f0-bc452b34aa28","Type":"ContainerStarted","Data":"282460d43525053fce86e953792721eef50c832d95622799164fdb4a8ab5ddbd"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.624950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gzl4v" event={"ID":"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a","Type":"ContainerStarted","Data":"c0b2769cae1a32788b8706cc85553a67ba412eaac3bf4fb7a80532bc8c222400"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.625022 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gzl4v" event={"ID":"6e789d01-3b09-4c6c-9931-f1a1bc87ee4a","Type":"ContainerStarted","Data":"ac7fe688e01b8f07c06ba0a93ca3c493a705e635bede1ab6194c00d2c44482a9"} Oct 02 18:20:22 crc 
kubenswrapper[4909]: I1002 18:20:22.627186 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" event={"ID":"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6","Type":"ContainerStarted","Data":"39e190b1dd664f6241702990ce8a669e33d1cd83e4b5dded009dd47ab7ff579e"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.627232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" event={"ID":"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6","Type":"ContainerStarted","Data":"a0dd6bcb7a7282dd345f134db6bf5be919b0c2c531fcfeefface1350a5836fe4"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.627442 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.629410 4909 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8nk9p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.629705 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.634671 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" event={"ID":"b82524c9-70dd-408b-a918-d1a27149732d","Type":"ContainerStarted","Data":"74c81a56121f1dde10249e154e5d8418bc4f6a11a2be73e357da8ed2bd38d29a"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.640317 4909 generic.go:334] 
"Generic (PLEG): container finished" podID="9c09468b-4235-4ea8-97ba-e395ef79d16a" containerID="c10557cf69ee9d7e93bbce6afb44597de572c374860e57e4459c7f9a0576d392" exitCode=0 Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.640432 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" event={"ID":"9c09468b-4235-4ea8-97ba-e395ef79d16a","Type":"ContainerDied","Data":"c10557cf69ee9d7e93bbce6afb44597de572c374860e57e4459c7f9a0576d392"} Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.717408 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.722210 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.222177616 +0000 UTC m=+144.409673475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.780924 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q"] Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.841210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.841819 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.34179892 +0000 UTC m=+144.529294779 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:22 crc kubenswrapper[4909]: W1002 18:20:22.871002 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ad0590_e5cc_4429_8438_b2128b560d2e.slice/crio-28c05f0da78ff7a00bfbf59101fe98dfedc88561d488bd5d082d672290da19ab WatchSource:0}: Error finding container 28c05f0da78ff7a00bfbf59101fe98dfedc88561d488bd5d082d672290da19ab: Status 404 returned error can't find the container with id 28c05f0da78ff7a00bfbf59101fe98dfedc88561d488bd5d082d672290da19ab Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.940965 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xn25h" podStartSLOduration=121.940948182 podStartE2EDuration="2m1.940948182s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:22.938975682 +0000 UTC m=+144.126471541" watchObservedRunningTime="2025-10-02 18:20:22.940948182 +0000 UTC m=+144.128444041" Oct 02 18:20:22 crc kubenswrapper[4909]: I1002 18:20:22.945235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 
18:20:22 crc kubenswrapper[4909]: E1002 18:20:22.946626 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.446580447 +0000 UTC m=+144.634076306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.047317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.048041 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.548002279 +0000 UTC m=+144.735498138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.055070 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.055123 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.149337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.149854 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.649834305 +0000 UTC m=+144.837330164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.203634 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f4f74"] Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.250573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.251233 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.751216776 +0000 UTC m=+144.938712635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.310654 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" podStartSLOduration=122.310634886 podStartE2EDuration="2m2.310634886s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.308622884 +0000 UTC m=+144.496118743" watchObservedRunningTime="2025-10-02 18:20:23.310634886 +0000 UTC m=+144.498130755" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.352561 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.353075 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.85301485 +0000 UTC m=+145.040510709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.454868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.455333 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:23.955314891 +0000 UTC m=+145.142810750 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.464231 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xw5x7" podStartSLOduration=123.464217045 podStartE2EDuration="2m3.464217045s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.461851492 +0000 UTC m=+144.649347361" watchObservedRunningTime="2025-10-02 18:20:23.464217045 +0000 UTC m=+144.651712904" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.485468 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" podStartSLOduration=122.485435467 podStartE2EDuration="2m2.485435467s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.482376494 +0000 UTC m=+144.669872383" watchObservedRunningTime="2025-10-02 18:20:23.485435467 +0000 UTC m=+144.672931326" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.509927 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8wr64" podStartSLOduration=5.509889751 podStartE2EDuration="5.509889751s" podCreationTimestamp="2025-10-02 18:20:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.507859058 +0000 UTC m=+144.695354917" watchObservedRunningTime="2025-10-02 18:20:23.509889751 +0000 UTC m=+144.697385620" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.533152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-59pwn" podStartSLOduration=122.533135997 podStartE2EDuration="2m2.533135997s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.530463855 +0000 UTC m=+144.717959714" watchObservedRunningTime="2025-10-02 18:20:23.533135997 +0000 UTC m=+144.720631856" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.556061 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.556643 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.05662569 +0000 UTC m=+145.244121549 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.578407 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" podStartSLOduration=123.57838596 podStartE2EDuration="2m3.57838596s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.5520718 +0000 UTC m=+144.739567659" watchObservedRunningTime="2025-10-02 18:20:23.57838596 +0000 UTC m=+144.765881819" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.589341 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.635946 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlfhx" podStartSLOduration=122.635928952 podStartE2EDuration="2m2.635928952s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.605161254 +0000 UTC m=+144.792657113" watchObservedRunningTime="2025-10-02 18:20:23.635928952 +0000 UTC m=+144.823424811" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.636126 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-5444994796-gzl4v" podStartSLOduration=122.636122328 podStartE2EDuration="2m2.636122328s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.635421506 +0000 UTC m=+144.822917355" watchObservedRunningTime="2025-10-02 18:20:23.636122328 +0000 UTC m=+144.823618187" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.657837 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.658267 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.158255449 +0000 UTC m=+145.345751308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.685870 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce9d69ec-1859-40eb-8851-4e88d5fc7f37" containerID="2a6724f0a5d887e4044160ffb2164d4ba74de7e9547ca8329cc4b013c4a04399" exitCode=0 Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.704488 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-knwzk" podStartSLOduration=122.704456602 podStartE2EDuration="2m2.704456602s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.702650557 +0000 UTC m=+144.890146416" watchObservedRunningTime="2025-10-02 18:20:23.704456602 +0000 UTC m=+144.891952461" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.758713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.760134 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 18:20:24.260075124 +0000 UTC m=+145.447571003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" event={"ID":"93ad0590-e5cc-4429-8438-b2128b560d2e","Type":"ContainerStarted","Data":"28c05f0da78ff7a00bfbf59101fe98dfedc88561d488bd5d082d672290da19ab"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" event={"ID":"173bf52b-b4ff-4331-a35b-9e70cacfc411","Type":"ContainerStarted","Data":"365e72552b12b5263640eac5dfa6546fa1c9ced8dce567b0e5b0b4730123931f"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" event={"ID":"ce9d69ec-1859-40eb-8851-4e88d5fc7f37","Type":"ContainerDied","Data":"2a6724f0a5d887e4044160ffb2164d4ba74de7e9547ca8329cc4b013c4a04399"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" event={"ID":"ce9d69ec-1859-40eb-8851-4e88d5fc7f37","Type":"ContainerStarted","Data":"c0a062aaff912312ec184e289eba479702b3a10725fc96a7fc84f446efea36c5"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764275 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" event={"ID":"9c09468b-4235-4ea8-97ba-e395ef79d16a","Type":"ContainerStarted","Data":"9244f488fec21bac536a19b1f32ea3c9693142fadaa2f9ee0830a49c10cfe258"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764289 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f4f74" event={"ID":"aea04f33-66f3-4724-b1cc-e37aa52a23b1","Type":"ContainerStarted","Data":"df17e3957242a7f076ecd41666ee543f9652e0edd2674b51b568d3495f648fd9"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.764320 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5zs5c" event={"ID":"228eb0f9-069f-40ba-97f0-bc452b34aa28","Type":"ContainerStarted","Data":"8a7a4ebdc8c6e9a95bfdfbf0107e21bc96793fc05254a49b78c1e996da029a82"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.775814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" event={"ID":"858c7aa2-095b-46d8-ad52-798fce64d3e8","Type":"ContainerStarted","Data":"4d62929aba907ab899e3ed5908f2de114ec37ddcbf46e3d03657d7a0a77c8664"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.775868 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" event={"ID":"858c7aa2-095b-46d8-ad52-798fce64d3e8","Type":"ContainerStarted","Data":"b33bf54e0e730003ee370faa67bd07d2dd35b00c6fa23d2e64f204d2cbb70b64"} Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.792831 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.863843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.865444 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.365429728 +0000 UTC m=+145.552925587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.970490 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:23 crc kubenswrapper[4909]: E1002 18:20:23.970885 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.470868065 +0000 UTC m=+145.658363924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:23 crc kubenswrapper[4909]: I1002 18:20:23.982743 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-265wz" podStartSLOduration=123.98272797999999 podStartE2EDuration="2m3.98272798s" podCreationTimestamp="2025-10-02 18:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:23.981603415 +0000 UTC m=+145.169099274" watchObservedRunningTime="2025-10-02 18:20:23.98272798 +0000 UTC m=+145.170223839" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.070265 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:24 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:24 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:24 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.070323 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.073092 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.081885 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.581866513 +0000 UTC m=+145.769362372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.127730 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv97q" podStartSLOduration=123.127696093 podStartE2EDuration="2m3.127696093s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:24.119047497 +0000 UTC m=+145.306543356" watchObservedRunningTime="2025-10-02 18:20:24.127696093 +0000 UTC m=+145.315191952" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.177567 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.178451 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.678430426 +0000 UTC m=+145.865926285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.290463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.290773 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.790759444 +0000 UTC m=+145.978255303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.331611 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.395364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.395749 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.895732277 +0000 UTC m=+146.083228136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.497878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.498207 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:24.998193892 +0000 UTC m=+146.185689751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.500641 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.511926 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xbgsg"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.520641 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.595891 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:24 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:24 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:24 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.595950 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.598583 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.601266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.601574 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.101560684 +0000 UTC m=+146.289056543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.609483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-htdx9"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.635259 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p4h8w"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.676122 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fgbzj"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.678992 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.704934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.705714 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.205699171 +0000 UTC m=+146.393195030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.736119 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.749163 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.759227 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v"] Oct 02 18:20:24 crc kubenswrapper[4909]: 
I1002 18:20:24.763056 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.784075 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sxts5"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.784141 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.796991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" event={"ID":"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162","Type":"ContainerStarted","Data":"f80af7dd6f5646437cc16491df496c6efecc664d29626b267eb4cf444b087a9f"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.806830 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.807157 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.307137164 +0000 UTC m=+146.494633033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.810954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htdx9" event={"ID":"5f3c0da1-b7e1-4efb-9a99-cd839d704925","Type":"ContainerStarted","Data":"07969b8bbc404ea2737d90e2b68109236761f4a88fb33f37ff1d0929b1b22274"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.811558 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ktt4z"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.819075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f4f74" event={"ID":"aea04f33-66f3-4724-b1cc-e37aa52a23b1","Type":"ContainerStarted","Data":"3be23b69ef2dc7ae8ae361869b1d420596f62bfe8ee707a160012b830e6a4db5"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.819974 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f4f74" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.822248 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-f4f74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.822280 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f4f74" podUID="aea04f33-66f3-4724-b1cc-e37aa52a23b1" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.831714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" event={"ID":"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c","Type":"ContainerStarted","Data":"f0e4eec9703baccf09efc30a0d29e0ee38007a7183a757d8d1e37f6ddcbec7ba"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.836343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.836400 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv"] Oct 02 18:20:24 crc kubenswrapper[4909]: W1002 18:20:24.837651 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef32ba74_fc1a_4af3_b2b9_207f4da636ad.slice/crio-1dfdfc438c69dcac9a3571c872d087b5ce85c7d87454acc0add6f01b39ec2945 WatchSource:0}: Error finding container 1dfdfc438c69dcac9a3571c872d087b5ce85c7d87454acc0add6f01b39ec2945: Status 404 returned error can't find the container with id 1dfdfc438c69dcac9a3571c872d087b5ce85c7d87454acc0add6f01b39ec2945 Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.871541 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f4f74" podStartSLOduration=123.871524266 podStartE2EDuration="2m3.871524266s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:24.868950357 +0000 UTC m=+146.056446216" watchObservedRunningTime="2025-10-02 18:20:24.871524266 +0000 UTC 
m=+146.059020115" Oct 02 18:20:24 crc kubenswrapper[4909]: W1002 18:20:24.908121 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b24b0d_7804_44b7_a0e4_80c89580fd36.slice/crio-085066fb311b4e5e48e5a45d1eb54f0acb6922c096e401571f6b8974081645b7 WatchSource:0}: Error finding container 085066fb311b4e5e48e5a45d1eb54f0acb6922c096e401571f6b8974081645b7: Status 404 returned error can't find the container with id 085066fb311b4e5e48e5a45d1eb54f0acb6922c096e401571f6b8974081645b7 Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.912390 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:24 crc kubenswrapper[4909]: E1002 18:20:24.913194 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.413176619 +0000 UTC m=+146.600672478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.922641 4909 generic.go:334] "Generic (PLEG): container finished" podID="93ad0590-e5cc-4429-8438-b2128b560d2e" containerID="a08fcd82a6e04ed984b6e5cd5116d8266cdcbbd427da046563fda903f425e985" exitCode=0 Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.922745 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" event={"ID":"93ad0590-e5cc-4429-8438-b2128b560d2e","Type":"ContainerDied","Data":"a08fcd82a6e04ed984b6e5cd5116d8266cdcbbd427da046563fda903f425e985"} Oct 02 18:20:24 crc kubenswrapper[4909]: W1002 18:20:24.922946 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88a862d_943e_4321_9dac_8bc48701e1d6.slice/crio-2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa WatchSource:0}: Error finding container 2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa: Status 404 returned error can't find the container with id 2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.946189 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" event={"ID":"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6","Type":"ContainerStarted","Data":"49570e51f06b0e7ca76970557658337e6ff0d7a4208ca342b40120bead3ba7a7"} Oct 02 18:20:24 crc kubenswrapper[4909]: 
I1002 18:20:24.946524 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.957482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" event={"ID":"ce9d69ec-1859-40eb-8851-4e88d5fc7f37","Type":"ContainerStarted","Data":"cc2c826c46207f01300db243b0f2ea80b095193154fe0582ac2e257b6d23cbfc"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.957544 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.961035 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.968143 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.981666 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jbrhg"] Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.988864 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" event={"ID":"9c09468b-4235-4ea8-97ba-e395ef79d16a","Type":"ContainerStarted","Data":"dc7c149b4316931c99428e85fa203677a7bde490ca18eee860f6609c89f9f8cb"} Oct 02 18:20:24 crc kubenswrapper[4909]: I1002 18:20:24.997252 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp"] Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:24.998347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" 
event={"ID":"36b9cbd7-5e82-4510-9291-77e63d43f6d5","Type":"ContainerStarted","Data":"bcfbd2fbdbbcdeac38a14d92920de1783990912268a54c700eaf7ce45950d1ff"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:24.999352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.014353 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.014773 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" podStartSLOduration=124.014752917 podStartE2EDuration="2m4.014752917s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:25.013973262 +0000 UTC m=+146.201469121" watchObservedRunningTime="2025-10-02 18:20:25.014752917 +0000 UTC m=+146.202248776" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.015106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" event={"ID":"c6fa9758-6958-41a4-bed8-61b9ead5b8f0","Type":"ContainerStarted","Data":"308e976c89e8fc3b9c4ed38994cef9d3997f9c2c290b1083e9c108614564d70d"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.015257 4909 patch_prober.go:28] interesting pod/console-operator-58897d9998-xbgsg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": 
dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.015292 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" podUID="36b9cbd7-5e82-4510-9291-77e63d43f6d5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.015865 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.515848181 +0000 UTC m=+146.703344040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: W1002 18:20:25.020294 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5670a8_418f_4523_ba53_dc5553cd867c.slice/crio-36caa26a8b13ccf4c1d9fb902b096f04d09cfcb8afffbf2c3eeee6c9b1693966 WatchSource:0}: Error finding container 36caa26a8b13ccf4c1d9fb902b096f04d09cfcb8afffbf2c3eeee6c9b1693966: Status 404 returned error can't find the container with id 36caa26a8b13ccf4c1d9fb902b096f04d09cfcb8afffbf2c3eeee6c9b1693966 Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.022914 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-p4h8w" event={"ID":"2d362a1c-75bb-4778-a58b-cf43e02bac6b","Type":"ContainerStarted","Data":"47f8add027b46c9823b26e3a86b9faa02aad2a7fac54041a35fd289a1f7807e4"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.041536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" event={"ID":"616911ad-becd-4f1f-b626-7bdbf8371f55","Type":"ContainerStarted","Data":"c449cb2b8f9d6feae8c419782299bd4739d5b015ac6f84870b68d927015c153a"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.046591 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" podStartSLOduration=124.046564956 podStartE2EDuration="2m4.046564956s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:25.041914233 +0000 UTC m=+146.229410092" watchObservedRunningTime="2025-10-02 18:20:25.046564956 +0000 UTC m=+146.234060805" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.054601 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8"] Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.059148 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qkdhr"] Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.075248 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" event={"ID":"f13738b4-01c1-4f4e-97d7-264026988f3f","Type":"ContainerStarted","Data":"693f661fab9259f53ea84d24d266a5f3e6ae3109a95adfeedd981df0bd6a1f4b"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.084497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" event={"ID":"173bf52b-b4ff-4331-a35b-9e70cacfc411","Type":"ContainerStarted","Data":"e8732240c66df2fa67b532eaf573aae6bb97e4ba25873b0b554ff28c8553a6a6"} Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.090266 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" podStartSLOduration=124.090252661 podStartE2EDuration="2m4.090252661s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:25.089836269 +0000 UTC m=+146.277332128" watchObservedRunningTime="2025-10-02 18:20:25.090252661 +0000 UTC m=+146.277748510" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.118688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.123647 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.623632189 +0000 UTC m=+146.811128038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: W1002 18:20:25.171772 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27121d66_e52c_4891_b50f_0509ccf58b38.slice/crio-be7113eb3235dfa49d409ff26705f2798475778a90e5db7f7fe776f750066a4a WatchSource:0}: Error finding container be7113eb3235dfa49d409ff26705f2798475778a90e5db7f7fe776f750066a4a: Status 404 returned error can't find the container with id be7113eb3235dfa49d409ff26705f2798475778a90e5db7f7fe776f750066a4a Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.219449 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.219583 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.719548282 +0000 UTC m=+146.907044141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.219800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.220157 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.7201422 +0000 UTC m=+146.907638069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.329688 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.340572 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.840542307 +0000 UTC m=+147.028038166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.443236 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.444041 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:25.944009094 +0000 UTC m=+147.131504963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.553048 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.553338 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.05332363 +0000 UTC m=+147.240819489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.601950 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:25 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:25 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:25 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.602062 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.654748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.655087 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 18:20:26.155073062 +0000 UTC m=+147.342568921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.756476 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.756867 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.256852677 +0000 UTC m=+147.444348536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.861527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.861921 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.361901541 +0000 UTC m=+147.549397400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.891470 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.891524 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" Oct 02 18:20:25 crc kubenswrapper[4909]: I1002 18:20:25.965961 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:25 crc kubenswrapper[4909]: E1002 18:20:25.966894 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.466868302 +0000 UTC m=+147.654364161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.067717 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.068098 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.568085339 +0000 UTC m=+147.755581198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.140818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" event={"ID":"b88a862d-943e-4321-9dac-8bc48701e1d6","Type":"ContainerStarted","Data":"2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.174761 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.175446 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.675418774 +0000 UTC m=+147.862914633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.211206 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" event={"ID":"af008f49-d73f-49c4-a3c7-b668fdfbce0f","Type":"ContainerStarted","Data":"c3957e98794953ebce72f4332e982bf2883c6e2470af04ebefb4685b009642c7"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.269514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" event={"ID":"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2","Type":"ContainerStarted","Data":"66ba7ab9ced24d84f435922c710ef8b7a2a0a18f5a7ddbd88d35580491894f5f"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.269583 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" event={"ID":"3605f811-7ddc-4be1-9dd3-1d2af7fcdec2","Type":"ContainerStarted","Data":"9408bf3744814d0af5213a4d43d9ddc2cdb1c3c74ca934fe460671c929a58a79"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.274151 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.280540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.281060 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.781041566 +0000 UTC m=+147.968537425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.282926 4909 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-72wcs container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.282979 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" podUID="3605f811-7ddc-4be1-9dd3-1d2af7fcdec2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.305156 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" podStartSLOduration=125.305137748 
podStartE2EDuration="2m5.305137748s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:25.125791836 +0000 UTC m=+146.313287695" watchObservedRunningTime="2025-10-02 18:20:26.305137748 +0000 UTC m=+147.492633607" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.306640 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" podStartSLOduration=125.306634514 podStartE2EDuration="2m5.306634514s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.306199211 +0000 UTC m=+147.493695080" watchObservedRunningTime="2025-10-02 18:20:26.306634514 +0000 UTC m=+147.494130373" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.313521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" event={"ID":"27121d66-e52c-4891-b50f-0509ccf58b38","Type":"ContainerStarted","Data":"be7113eb3235dfa49d409ff26705f2798475778a90e5db7f7fe776f750066a4a"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.322473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" event={"ID":"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c","Type":"ContainerStarted","Data":"086d506fc2851ddba8186a63ff1024c5ff3412fe13fa402082783191f1b6fa85"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.333758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htdx9" event={"ID":"5f3c0da1-b7e1-4efb-9a99-cd839d704925","Type":"ContainerStarted","Data":"fd5f6675bfc8e685128c56ea156ce5e8cc5da401c44d8f8da714d8b4c9e2182a"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 
18:20:26.353215 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" event={"ID":"c6fa9758-6958-41a4-bed8-61b9ead5b8f0","Type":"ContainerStarted","Data":"bd21c0f18a4f7900b9801e6e60e58a02c1fb45a4482d0b9f8242e6976383e8fc"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.375013 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" event={"ID":"7e5df8ca-21df-468a-9ee3-4f98b3ed95e6","Type":"ContainerStarted","Data":"9761875b81c7519151272b1a37a7cb85f857ce0cb3403f736c1dd7d8f396c236"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.390878 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.391676 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.891646702 +0000 UTC m=+148.079142561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.392635 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.393111 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:26.893096767 +0000 UTC m=+148.080592636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.393557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" event={"ID":"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162","Type":"ContainerStarted","Data":"07340203c961c4bb793ffd9776faebdcc50cc18deafb188597718e5eed432e9d"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.405411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" event={"ID":"2d5670a8-418f-4523-ba53-dc5553cd867c","Type":"ContainerStarted","Data":"36caa26a8b13ccf4c1d9fb902b096f04d09cfcb8afffbf2c3eeee6c9b1693966"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.432991 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x5l2g" podStartSLOduration=125.432960714 podStartE2EDuration="2m5.432960714s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.432412607 +0000 UTC m=+147.619908466" watchObservedRunningTime="2025-10-02 18:20:26.432960714 +0000 UTC m=+147.620456573" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.433711 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vd9vs" podStartSLOduration=125.433704196 podStartE2EDuration="2m5.433704196s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.404358873 +0000 UTC m=+147.591854732" watchObservedRunningTime="2025-10-02 18:20:26.433704196 +0000 UTC m=+147.621200055" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.447626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" event={"ID":"616911ad-becd-4f1f-b626-7bdbf8371f55","Type":"ContainerStarted","Data":"62c06bdf6b8a132b55c02be7a4a59e63a230d8b4a894f8ce5bb8d83d484d864b"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.452570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" event={"ID":"bc0dec87-de8d-40c7-99ad-3e0f933885e6","Type":"ContainerStarted","Data":"20fbb6f9d5ca2049a0835ad2dbe3b65242472e42eae97899f3100a141db97167"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.453389 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.456865 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-645jg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.456920 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.468444 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" event={"ID":"a2677c8a-f86e-4c9c-9c41-b59b38904cb4","Type":"ContainerStarted","Data":"533380e5724e2a20216c9043d87b10835f65eed8ab9e5cb84cd542e8723784f4"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.468517 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" event={"ID":"a2677c8a-f86e-4c9c-9c41-b59b38904cb4","Type":"ContainerStarted","Data":"45a7f1d7525f43d49cf8d754c11bb05510ed6b2cb693a74b5e05362aa5a56bf3"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.501043 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.502360 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.002344171 +0000 UTC m=+148.189840030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.502344 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j4dpc" podStartSLOduration=125.50231857 podStartE2EDuration="2m5.50231857s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.488604657 +0000 UTC m=+147.676100516" watchObservedRunningTime="2025-10-02 18:20:26.50231857 +0000 UTC m=+147.689814429" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.510440 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jlsmw" event={"ID":"173bf52b-b4ff-4331-a35b-9e70cacfc411","Type":"ContainerStarted","Data":"4cfb2be1de6561a95fdb0a95554ec4b1f023363157a06e81f884a67059f161eb"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.538618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p4h8w" event={"ID":"2d362a1c-75bb-4778-a58b-cf43e02bac6b","Type":"ContainerStarted","Data":"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.572557 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" podStartSLOduration=125.572540492 podStartE2EDuration="2m5.572540492s" 
podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.571991494 +0000 UTC m=+147.759487363" watchObservedRunningTime="2025-10-02 18:20:26.572540492 +0000 UTC m=+147.760036351" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.578288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" event={"ID":"37b24b0d-7804-44b7-a0e4-80c89580fd36","Type":"ContainerStarted","Data":"085066fb311b4e5e48e5a45d1eb54f0acb6922c096e401571f6b8974081645b7"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.589818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" event={"ID":"b1d614ec-9adb-4a95-b744-4ffced9f5f0e","Type":"ContainerStarted","Data":"e77657d0c39b3c61eb0d46cc59074e3f107194a6e4ad403199efbe191210a90d"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.589906 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" event={"ID":"b1d614ec-9adb-4a95-b744-4ffced9f5f0e","Type":"ContainerStarted","Data":"9a471050f02308696eb05c3724e20893dc6bfa68fdfa22870e5d33a6ae0d1ace"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.603263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.607219 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.107194709 +0000 UTC m=+148.294690568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.607598 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:26 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:26 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:26 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.616736 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.623118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" event={"ID":"c23b0961-4854-4dda-a5f6-861fb04604fa","Type":"ContainerStarted","Data":"c24bd118e220fb4ed233b49734916da41e9514ff2ba681781fc96aff5fe41625"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.670088 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-f9d7485db-p4h8w" podStartSLOduration=125.670068484 podStartE2EDuration="2m5.670068484s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.654556506 +0000 UTC m=+147.842052375" watchObservedRunningTime="2025-10-02 18:20:26.670068484 +0000 UTC m=+147.857564343" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.672698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" event={"ID":"ef32ba74-fc1a-4af3-b2b9-207f4da636ad","Type":"ContainerStarted","Data":"3d659c6c75e15fdd1de7ef9d8aae8b5ce2b382f7b8e8d1cf739218a5bd0091f3"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.678060 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.678085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" event={"ID":"ef32ba74-fc1a-4af3-b2b9-207f4da636ad","Type":"ContainerStarted","Data":"1dfdfc438c69dcac9a3571c872d087b5ce85c7d87454acc0add6f01b39ec2945"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.686454 4909 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-v6cks container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.686557 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" podUID="ef32ba74-fc1a-4af3-b2b9-207f4da636ad" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.700383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" event={"ID":"36b9cbd7-5e82-4510-9291-77e63d43f6d5","Type":"ContainerStarted","Data":"2e1e89b3116c47f3dd087e1c55b21cefe976d64a3a4c78c263cc774501d6380a"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.705597 4909 patch_prober.go:28] interesting pod/console-operator-58897d9998-xbgsg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.705731 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" podUID="36b9cbd7-5e82-4510-9291-77e63d43f6d5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.707492 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" podStartSLOduration=125.707467456 podStartE2EDuration="2m5.707467456s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.706398883 +0000 UTC m=+147.893894742" watchObservedRunningTime="2025-10-02 18:20:26.707467456 +0000 UTC m=+147.894963315" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.711672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sxts5" 
event={"ID":"bff386cb-82c9-451c-af23-09930f9359e8","Type":"ContainerStarted","Data":"a3a70d7c3b4fad91a87baf1bf2a19d84a1a5a999f4f23ca7476dda7e275f6127"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.716557 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.717514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" event={"ID":"ddb18400-1a15-48bb-a28e-27d19e0dd04c","Type":"ContainerStarted","Data":"376cf554512ad3aa8c7bb1fbf1d3efca165881fb9271f377e6a37c22d310b668"} Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.720520 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.220488967 +0000 UTC m=+148.407984826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.743909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" event={"ID":"a7cc0f59-db36-4e76-93fe-4c99f3e621a0","Type":"ContainerStarted","Data":"2e24d7fbb26e47cc4fa763cf38f0955f53e76e25025aac87fb2b7006a7405a0b"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.752634 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sxts5" podStartSLOduration=7.752610006 podStartE2EDuration="7.752610006s" podCreationTimestamp="2025-10-02 18:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.743456834 +0000 UTC m=+147.930952693" watchObservedRunningTime="2025-10-02 18:20:26.752610006 +0000 UTC m=+147.940105875" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.767122 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" event={"ID":"8cac17e4-1168-4aa3-9609-03dbaf5f76df","Type":"ContainerStarted","Data":"25d37fa779119221efd12297f8c9557f8a1324593fec29d5530ec7d8b2814c94"} Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.768290 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-f4f74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 
10.217.0.30:8080: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.768330 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f4f74" podUID="aea04f33-66f3-4724-b1cc-e37aa52a23b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.768558 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.792086 4909 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-blmm6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.792128 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" podUID="8cac17e4-1168-4aa3-9609-03dbaf5f76df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.795355 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" podStartSLOduration=125.795328062 podStartE2EDuration="2m5.795328062s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.792193755 +0000 UTC m=+147.979689614" watchObservedRunningTime="2025-10-02 18:20:26.795328062 +0000 UTC m=+147.982823921" Oct 02 
18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.827627 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.830431 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.330411752 +0000 UTC m=+148.517907611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.871431 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" podStartSLOduration=125.871403164 podStartE2EDuration="2m5.871403164s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.87030791 +0000 UTC m=+148.057803779" watchObservedRunningTime="2025-10-02 18:20:26.871403164 +0000 UTC m=+148.058899013" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.875695 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" podStartSLOduration=124.875668985 podStartE2EDuration="2m4.875668985s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:26.841504863 +0000 UTC m=+148.029000712" watchObservedRunningTime="2025-10-02 18:20:26.875668985 +0000 UTC m=+148.063164844" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.906827 4909 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5bt4k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]log ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]etcd ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/max-in-flight-filter ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 18:20:26 crc kubenswrapper[4909]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 18:20:26 crc kubenswrapper[4909]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 18:20:26 crc kubenswrapper[4909]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Oct 02 18:20:26 crc kubenswrapper[4909]: 
[+]poststarthook/openshift.io-restmapperupdater ok Oct 02 18:20:26 crc kubenswrapper[4909]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 18:20:26 crc kubenswrapper[4909]: livez check failed Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.906984 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k" podUID="9c09468b-4235-4ea8-97ba-e395ef79d16a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:26 crc kubenswrapper[4909]: I1002 18:20:26.932505 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:26 crc kubenswrapper[4909]: E1002 18:20:26.934100 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.434076773 +0000 UTC m=+148.621572632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.034864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.035283 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.535265849 +0000 UTC m=+148.722761708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.136158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.136367 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.636334361 +0000 UTC m=+148.823830220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.136512 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.136943 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.636909829 +0000 UTC m=+148.824405688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.238232 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.238441 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.738412504 +0000 UTC m=+148.925908363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.238944 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.239346 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.739320972 +0000 UTC m=+148.926816831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.340172 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.340385 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.840355073 +0000 UTC m=+149.027850932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.340719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.341210 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.84120059 +0000 UTC m=+149.028696449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.441334 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.441500 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.941470976 +0000 UTC m=+149.128966835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.441741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.442065 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:27.942050865 +0000 UTC m=+149.129546724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.543151 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.543344 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.043314432 +0000 UTC m=+149.230810291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.543414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.543695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.544119 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.044099396 +0000 UTC m=+149.231595255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.545869 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.593074 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:27 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:27 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:27 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.593171 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.645681 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.645894 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.14586857 +0000 UTC m=+149.333364429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.646347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.646403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.646490 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.646531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.647127 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.147102208 +0000 UTC m=+149.334598067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.652462 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.653145 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.653310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.699291 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qvn5d" Oct 02 18:20:27 crc 
kubenswrapper[4909]: I1002 18:20:27.727871 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.737157 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.746117 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.746933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.747127 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.247090847 +0000 UTC m=+149.434586706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.747375 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.747812 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.247801399 +0000 UTC m=+149.435297258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.815972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" event={"ID":"1eb7c0e0-0413-4ea1-9b9d-253f31a0a162","Type":"ContainerStarted","Data":"272019fdfe0f89123bd250faadc4c796bcd384d4a64d92a5f391f3fed6ab4c5d"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.820438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" event={"ID":"8cac17e4-1168-4aa3-9609-03dbaf5f76df","Type":"ContainerStarted","Data":"173cb07691a84e84c568fb84c51e7ebd93596d4037ce561edb2ec291d627dc52"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.828213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4pvv" event={"ID":"a7cc0f59-db36-4e76-93fe-4c99f3e621a0","Type":"ContainerStarted","Data":"7e903cae19bee95e36e1ce3e7131eab6cfe7ecb6e45870a7ff8f4fc0970d69f9"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.831784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" event={"ID":"ddb18400-1a15-48bb-a28e-27d19e0dd04c","Type":"ContainerStarted","Data":"3f88b44b3eb5951b72d82d1da8b236f7b30e9bb8c2b8e41fb795bb20f6c60500"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.840866 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" event={"ID":"37b24b0d-7804-44b7-a0e4-80c89580fd36","Type":"ContainerStarted","Data":"c9c56ecfbb30d63d0af17abedc6f121c95ae790d6017379e4dc78e9825cfa20a"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.840937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" event={"ID":"37b24b0d-7804-44b7-a0e4-80c89580fd36","Type":"ContainerStarted","Data":"b94fd6bf111d1015a977dd767bc628b45f244875edcaf5adc4968447e13045f0"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.847192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" event={"ID":"b1d614ec-9adb-4a95-b744-4ffced9f5f0e","Type":"ContainerStarted","Data":"db12246c24a8aad62b2dc0c7d3a1a6b0ae58f6c779b100e5e8d73f8ee7b22202"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.848984 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.849718 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.349687436 +0000 UTC m=+149.537183295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.857227 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" event={"ID":"4bcc6ed7-1c31-4831-8b45-2729aaa5f89c","Type":"ContainerStarted","Data":"c9b3a4c5594a3fb1623d3cb09675dd083cbd9128d1fa2a9f38e64295a98cccf7"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.858691 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-v5mwt" podStartSLOduration=126.858665122 podStartE2EDuration="2m6.858665122s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:27.856843786 +0000 UTC m=+149.044339635" watchObservedRunningTime="2025-10-02 18:20:27.858665122 +0000 UTC m=+149.046160981" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.864723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" event={"ID":"af008f49-d73f-49c4-a3c7-b668fdfbce0f","Type":"ContainerStarted","Data":"88a84e91af9608b993a23a1bce778437a456e23fb7feee25afda2362d89ac4fa"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.867523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" 
event={"ID":"93ad0590-e5cc-4429-8438-b2128b560d2e","Type":"ContainerStarted","Data":"5234ca0e8eba21538a29517c9616f6739b371e1ae05a9ab76a9ff25de1038425"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.880811 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sxts5" event={"ID":"bff386cb-82c9-451c-af23-09930f9359e8","Type":"ContainerStarted","Data":"f750ff89d19f579acc4ae0e950f999e6b83d4aaae0c18ad77efe751484a7ca53"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.884659 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ddkph" podStartSLOduration=126.884639752 podStartE2EDuration="2m6.884639752s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:27.881612929 +0000 UTC m=+149.069108788" watchObservedRunningTime="2025-10-02 18:20:27.884639752 +0000 UTC m=+149.072135601" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.888448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" event={"ID":"f13738b4-01c1-4f4e-97d7-264026988f3f","Type":"ContainerStarted","Data":"ac8d58a780fbe476bf992a5099785d638124ab5828cd1f668731b692691d5b0d"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.888521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" event={"ID":"f13738b4-01c1-4f4e-97d7-264026988f3f","Type":"ContainerStarted","Data":"7d221b39d0935d1a4162acb63c3e00a91c5b9d40b8a49ebfa4ffd8b266c84a94"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.899536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htdx9" 
event={"ID":"5f3c0da1-b7e1-4efb-9a99-cd839d704925","Type":"ContainerStarted","Data":"1ec591ba85016427c56f879b54ed0365efbf3c3d739098b468091893a9eaf666"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.900599 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.909873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" event={"ID":"b88a862d-943e-4321-9dac-8bc48701e1d6","Type":"ContainerStarted","Data":"83d170020d3b48b19b95904733fb705249c724f2dee9f6299d72e30872769683"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.910443 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.916752 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ktt4z" podStartSLOduration=126.916741411 podStartE2EDuration="2m6.916741411s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:27.91511176 +0000 UTC m=+149.102607619" watchObservedRunningTime="2025-10-02 18:20:27.916741411 +0000 UTC m=+149.104237270" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.924672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" event={"ID":"27121d66-e52c-4891-b50f-0509ccf58b38","Type":"ContainerStarted","Data":"f1bb2ef7a2179da9c7c375869fb6b7297221dca4097237c26099e6a8b30e264a"} Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.953296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.957933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" event={"ID":"c23b0961-4854-4dda-a5f6-861fb04604fa","Type":"ContainerStarted","Data":"dfce2f54ca073333a30660c0a48fa99b12742ba8347d7a17694f5dcc56ac3b79"} Oct 02 18:20:27 crc kubenswrapper[4909]: E1002 18:20:27.959984 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.459957451 +0000 UTC m=+149.647453300 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.969175 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jbrhg" podStartSLOduration=125.969096773 podStartE2EDuration="2m5.969096773s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:27.965130261 +0000 UTC m=+149.152626120" watchObservedRunningTime="2025-10-02 18:20:27.969096773 
+0000 UTC m=+149.156592632" Oct 02 18:20:27 crc kubenswrapper[4909]: I1002 18:20:27.970220 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9flkb" podStartSLOduration=126.970214447 podStartE2EDuration="2m6.970214447s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:27.936576451 +0000 UTC m=+149.124072310" watchObservedRunningTime="2025-10-02 18:20:27.970214447 +0000 UTC m=+149.157710296" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.010937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" event={"ID":"a2677c8a-f86e-4c9c-9c41-b59b38904cb4","Type":"ContainerStarted","Data":"da383e65bdfa67be7e942ab2563b864fb9da0ba39b85a91287096e56c692f657"} Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.023730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" event={"ID":"2d5670a8-418f-4523-ba53-dc5553cd867c","Type":"ContainerStarted","Data":"b7cfc0e82a09cb61774f0bc9d8a0639e56061f1e6ce17d8178f3753d6dfc00e2"} Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.026915 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.066461 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" event={"ID":"bc0dec87-de8d-40c7-99ad-3e0f933885e6","Type":"ContainerStarted","Data":"b9233162b7819c9d16f557083e7a4e390d2134b943e9ac9682e0ded23421ea0f"} Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.073094 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.073559 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.573534949 +0000 UTC m=+149.761030808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.073696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.075254 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" podStartSLOduration=127.075211069 podStartE2EDuration="2m7.075211069s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.065361127 +0000 UTC m=+149.252856986" watchObservedRunningTime="2025-10-02 18:20:28.075211069 +0000 UTC m=+149.262706928" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.075599 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.575588511 +0000 UTC m=+149.763084360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.079599 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-645jg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.079656 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.112241 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-f4f74 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.112295 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f4f74" podUID="aea04f33-66f3-4724-b1cc-e37aa52a23b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.119393 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-72wcs" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.119586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xbgsg" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.122972 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v6cks" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.164102 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" podStartSLOduration=126.164082496 podStartE2EDuration="2m6.164082496s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.162944341 +0000 UTC m=+149.350440220" watchObservedRunningTime="2025-10-02 18:20:28.164082496 +0000 UTC m=+149.351578355" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.176925 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.179053 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.679012605 +0000 UTC m=+149.866508464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.227122 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fgbzj" podStartSLOduration=127.227102626 podStartE2EDuration="2m7.227102626s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.226006283 +0000 UTC m=+149.413502152" watchObservedRunningTime="2025-10-02 18:20:28.227102626 +0000 UTC m=+149.414598485" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.281778 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: 
\"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.282395 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.782374939 +0000 UTC m=+149.969870798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.288854 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-htdx9" podStartSLOduration=9.288834917 podStartE2EDuration="9.288834917s" podCreationTimestamp="2025-10-02 18:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.287810766 +0000 UTC m=+149.475306625" watchObservedRunningTime="2025-10-02 18:20:28.288834917 +0000 UTC m=+149.476330776" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.347881 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" podStartSLOduration=126.347854964 podStartE2EDuration="2m6.347854964s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.346579605 +0000 UTC 
m=+149.534075464" watchObservedRunningTime="2025-10-02 18:20:28.347854964 +0000 UTC m=+149.535350823" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.350314 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7c4s8" podStartSLOduration=126.35030524 podStartE2EDuration="2m6.35030524s" podCreationTimestamp="2025-10-02 18:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.314197798 +0000 UTC m=+149.501693657" watchObservedRunningTime="2025-10-02 18:20:28.35030524 +0000 UTC m=+149.537801099" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.388548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.388573 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.888551578 +0000 UTC m=+150.076047437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.389252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.392007 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.891987433 +0000 UTC m=+150.079483292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.439289 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lqrnp" podStartSLOduration=127.43926946 podStartE2EDuration="2m7.43926946s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:28.408213893 +0000 UTC m=+149.595709752" watchObservedRunningTime="2025-10-02 18:20:28.43926946 +0000 UTC m=+149.626765319" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.496063 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.996038978 +0000 UTC m=+150.183534837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.494765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.497017 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.498629 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:28.998613307 +0000 UTC m=+150.186109166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.595270 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:28 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:28 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:28 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.595329 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.595852 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.599182 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.599599 4909 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.099580745 +0000 UTC m=+150.287076604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.605332 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gr59x"] Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.606385 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.618792 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.632373 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gr59x"] Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.700507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.700920 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.700975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.700993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t422s\" (UniqueName: 
\"kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.701346 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.201333519 +0000 UTC m=+150.388829378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: W1002 18:20:28.770734 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c4bb380e6bb9c801c71bdc79a39a6a63457367ff00dc0911f208cee463b40632 WatchSource:0}: Error finding container c4bb380e6bb9c801c71bdc79a39a6a63457367ff00dc0911f208cee463b40632: Status 404 returned error can't find the container with id c4bb380e6bb9c801c71bdc79a39a6a63457367ff00dc0911f208cee463b40632 Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.794176 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gsjh6"] Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.798343 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.803630 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.804901 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.805211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.805314 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.805345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t422s\" (UniqueName: \"kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.807689 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.307654502 +0000 UTC m=+150.495150361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.808155 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.808182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.809571 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gsjh6"] Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.823348 4909 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-blmm6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.823412 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" podUID="8cac17e4-1168-4aa3-9609-03dbaf5f76df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.846581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t422s\" (UniqueName: \"kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s\") pod \"certified-operators-gr59x\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.907106 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.907152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.907173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.907237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f787\" (UniqueName: \"kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:28 crc kubenswrapper[4909]: E1002 18:20:28.907527 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.407514317 +0000 UTC m=+150.595010166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.973336 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"] Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.975129 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:28 crc kubenswrapper[4909]: I1002 18:20:28.990318 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"] Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.014621 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.014985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f787\" (UniqueName: \"kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.015097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.015130 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.015623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.015737 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.515716838 +0000 UTC m=+150.703212697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.016700 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.017175 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.057817 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f787\" (UniqueName: \"kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787\") pod \"community-operators-gsjh6\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.120226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smspx\" (UniqueName: \"kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.120297 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.120318 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.120349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.120772 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.620756093 +0000 UTC m=+150.808251952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.131354 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6c64437e3c82df9127aa0dca86e6b7295bd03c5027fb406eda1f6d7c4b4f5bd1"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.131439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6a8eed286bbb91f26b285a52c9044c2841a168d8c286958953cbb79ae5c51249"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.138777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5f19a5b443231ddf2d5a0859935c76a2e14aa0cb75e9b261252fbad21f87e7f3"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.138827 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"500558ccad794f0d552a74dac0a6a96398c0358e680000b4b05727cf8fdb0146"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.143280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" event={"ID":"c23b0961-4854-4dda-a5f6-861fb04604fa","Type":"ContainerStarted","Data":"d5acf7479517655b09a1a7c3b0847d602a06aee0cc1c423ab7b477d852557c9c"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.144999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3aa3119217ce029986bfc2fb85bf85a18aa57d9906c50efa18ba2882cc3f9f07"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.145182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c4bb380e6bb9c801c71bdc79a39a6a63457367ff00dc0911f208cee463b40632"} Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.145425 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-645jg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.145492 4909 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-645jg" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.166227 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ssdjj"] Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.168248 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.183799 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssdjj"] Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.185531 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.222443 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.222813 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.722780304 +0000 UTC m=+150.910276163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227650 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6v2x\" (UniqueName: \"kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smspx\" (UniqueName: \"kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities\") pod \"certified-operators-bh4ft\" (UID: 
\"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227954 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.227984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.228121 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.249282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.249607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content\") pod 
\"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.249892 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.749870878 +0000 UTC m=+150.937366737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.251526 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-blmm6" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.275876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smspx\" (UniqueName: \"kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx\") pod \"certified-operators-bh4ft\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.331631 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 18:20:29 crc kubenswrapper[4909]: 
I1002 18:20:29.331845 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.331897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.331946 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6v2x\" (UniqueName: \"kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.332370 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.832348928 +0000 UTC m=+151.019844787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.332776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.333050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.338326 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bh4ft"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.369851 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6v2x\" (UniqueName: \"kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x\") pod \"community-operators-ssdjj\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.436424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.436934 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:29.936916677 +0000 UTC m=+151.124412546 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.494682 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gr59x"]
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.497379 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssdjj"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.537576 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.537822 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:30.037807204 +0000 UTC m=+151.225303063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.560559 4909 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.596032 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 18:20:29 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld
Oct 02 18:20:29 crc kubenswrapper[4909]: [+]process-running ok
Oct 02 18:20:29 crc kubenswrapper[4909]: healthz check failed
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.596094 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.639642 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.639982 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 18:20:30.13996969 +0000 UTC m=+151.327465549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kpmgk" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.651455 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gsjh6"]
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.726980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"]
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.741084 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:29 crc kubenswrapper[4909]: E1002 18:20:29.741655 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 18:20:30.24163778 +0000 UTC m=+151.429133639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.815996 4909 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T18:20:29.560597535Z","Handler":null,"Name":""}
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.821132 4909 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.821178 4909 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.856663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.863924 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.863957 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.946731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kpmgk\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.957881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.968261 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssdjj"]
Oct 02 18:20:29 crc kubenswrapper[4909]: I1002 18:20:29.969252 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.143615 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.151912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" event={"ID":"c23b0961-4854-4dda-a5f6-861fb04604fa","Type":"ContainerStarted","Data":"119212c1dfd2698d35a6f07309c4d713078171fc3fcd81cee5a17e78e73571d5"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.152886 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerStarted","Data":"dc0cacb83c8903a35402778ee886d9a401f708e64c4e5390cdd88af58ddd37fa"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.154651 4909 generic.go:334] "Generic (PLEG): container finished" podID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerID="52e3f305bc26154eda9e2e8fdc317b451a9e2d51d19c41d84bbbbe1a64b0dd9e" exitCode=0
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.154721 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerDied","Data":"52e3f305bc26154eda9e2e8fdc317b451a9e2d51d19c41d84bbbbe1a64b0dd9e"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.154760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerStarted","Data":"2cff69f2c28b4588ae8462732736f791c0d7227fadd4f67aa3e446885577838d"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.156555 4909 generic.go:334] "Generic (PLEG): container finished" podID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerID="bcf255961ce789802b8cc00bbdf9e96f050cd885256beb223974e2f9a1e73058" exitCode=0
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.156640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerDied","Data":"bcf255961ce789802b8cc00bbdf9e96f050cd885256beb223974e2f9a1e73058"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.156667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerStarted","Data":"4386889c3f6d67e3f017e16a9b3c28756d71fdc739df9d755acb9be6a5305312"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.156722 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.159561 4909 generic.go:334] "Generic (PLEG): container finished" podID="076f2c01-2196-4903-939a-8cc350611bf7" containerID="5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2" exitCode=0
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.159809 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerDied","Data":"5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.159835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerStarted","Data":"da4660d5432a9e554d162cb3ad31de89ce032e5d95311c4cf102f3d896221588"}
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.388233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"]
Oct 02 18:20:30 crc kubenswrapper[4909]: W1002 18:20:30.397094 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8d1462_46e9_4190_82f2_224aa63e60d8.slice/crio-84504889a05a34f987d2d73bd46df225d93acdf61386c3df42f4b226ac055f40 WatchSource:0}: Error finding container 84504889a05a34f987d2d73bd46df225d93acdf61386c3df42f4b226ac055f40: Status 404 returned error can't find the container with id 84504889a05a34f987d2d73bd46df225d93acdf61386c3df42f4b226ac055f40
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.571516 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"]
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.573179 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.575738 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.587502 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"]
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.596700 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 18:20:30 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld
Oct 02 18:20:30 crc kubenswrapper[4909]: [+]process-running ok
Oct 02 18:20:30 crc kubenswrapper[4909]: healthz check failed
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.596744 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.675691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.675770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shx6t\" (UniqueName: \"kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.676014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.777841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.777911 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shx6t\" (UniqueName: \"kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.777950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.778500 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.778766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.816336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shx6t\" (UniqueName: \"kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t\") pod \"redhat-marketplace-chbqr\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.894235 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.898851 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5bt4k"
Oct 02 18:20:30 crc kubenswrapper[4909]: I1002 18:20:30.903289 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chbqr"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.024393 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.026856 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.056943 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.192878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" event={"ID":"0c8d1462-46e9-4190-82f2-224aa63e60d8","Type":"ContainerStarted","Data":"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491"}
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.192952 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" event={"ID":"0c8d1462-46e9-4190-82f2-224aa63e60d8","Type":"ContainerStarted","Data":"84504889a05a34f987d2d73bd46df225d93acdf61386c3df42f4b226ac055f40"}
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.197533 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.202348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.202500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.202699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7d9g\" (UniqueName: \"kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.204052 4909 generic.go:334] "Generic (PLEG): container finished" podID="ddb18400-1a15-48bb-a28e-27d19e0dd04c" containerID="3f88b44b3eb5951b72d82d1da8b236f7b30e9bb8c2b8e41fb795bb20f6c60500" exitCode=0
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.204132 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" event={"ID":"ddb18400-1a15-48bb-a28e-27d19e0dd04c","Type":"ContainerDied","Data":"3f88b44b3eb5951b72d82d1da8b236f7b30e9bb8c2b8e41fb795bb20f6c60500"}
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.214991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" event={"ID":"c23b0961-4854-4dda-a5f6-861fb04604fa","Type":"ContainerStarted","Data":"7fe9d5245c574a65d9853658d0f94f045ea92add5b6c023ad9cebfacf6d216f7"}
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.218472 4909 generic.go:334] "Generic (PLEG): container finished" podID="31e86782-f422-42e5-b517-80cea89f46e3" containerID="22bce0f14558396fb173e8427478335ef0edef82c7101a199f80a85e3e8197cc" exitCode=0
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.218513 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerDied","Data":"22bce0f14558396fb173e8427478335ef0edef82c7101a199f80a85e3e8197cc"}
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.267872 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" podStartSLOduration=130.267814402 podStartE2EDuration="2m10.267814402s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:31.230629706 +0000 UTC m=+152.418125575" watchObservedRunningTime="2025-10-02 18:20:31.267814402 +0000 UTC m=+152.455310261"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.303417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.303500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.303534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7d9g\" (UniqueName: \"kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.304828 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.308332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.317103 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.320939 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.321867 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.324337 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.326633 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.328681 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 18:20:31 crc kubenswrapper[4909]: W1002 18:20:31.350912 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50c487d_2e9f_486b_a803_d9e4b7756c79.slice/crio-53951c5c1a4a072b8af072854a3241473c4f768badee0d2c3438613bb7efcd2e WatchSource:0}: Error finding container 53951c5c1a4a072b8af072854a3241473c4f768badee0d2c3438613bb7efcd2e: Status 404 returned error can't find the container with id 53951c5c1a4a072b8af072854a3241473c4f768badee0d2c3438613bb7efcd2e
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.356677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7d9g\" (UniqueName: \"kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g\") pod \"redhat-marketplace-f6s2t\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.359299 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qkdhr" podStartSLOduration=12.359283418 podStartE2EDuration="12.359283418s" podCreationTimestamp="2025-10-02 18:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:31.338974233 +0000 UTC m=+152.526470112" watchObservedRunningTime="2025-10-02 18:20:31.359283418 +0000 UTC m=+152.546779277"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.403159 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6s2t"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.422577 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.423988 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.426128 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.427577 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.437466 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.513060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.513116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.587379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gzl4v"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.591888 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 18:20:31 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld
Oct 02 18:20:31 crc kubenswrapper[4909]: [+]process-running ok
Oct 02 18:20:31 crc kubenswrapper[4909]: healthz check failed
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.591932 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.614653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.614713 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.614808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.614841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.614917 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.626882 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.636604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.655336 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-f4f74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get
\"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.655407 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f4f74" podUID="aea04f33-66f3-4724-b1cc-e37aa52a23b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.658308 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-f4f74 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.658336 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f4f74" podUID="aea04f33-66f3-4724-b1cc-e37aa52a23b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.680654 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.680712 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.692103 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.717646 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.720791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.720863 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.720931 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.749119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.750478 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.750571 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.752400 4909 patch_prober.go:28] interesting pod/console-f9d7485db-p4h8w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.752460 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-p4h8w" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.758155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.852761 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"] Oct 02 18:20:31 crc kubenswrapper[4909]: W1002 18:20:31.866127 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15724ee_af29_4942_8bb4_5aa22a1c03ff.slice/crio-212a6955778c005adb336cce8edf2740a0eec4caa578d7170a48bff0001b981d WatchSource:0}: Error finding container 212a6955778c005adb336cce8edf2740a0eec4caa578d7170a48bff0001b981d: Status 404 returned error can't find the container with id 212a6955778c005adb336cce8edf2740a0eec4caa578d7170a48bff0001b981d Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.984163 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.985815 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:31 crc kubenswrapper[4909]: I1002 18:20:31.987662 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:31.992558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.131886 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlcv\" (UniqueName: \"kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.132043 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.132166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.242663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlcv\" (UniqueName: \"kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv\") pod \"redhat-operators-xrznm\" (UID: 
\"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.242841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.242898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.243904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.246187 4909 generic.go:334] "Generic (PLEG): container finished" podID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerID="584f9913e0241d9cd10262d313a52260d20d3b71e7a5c3444996bd3424c8fb99" exitCode=0 Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.246200 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.246272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerDied","Data":"584f9913e0241d9cd10262d313a52260d20d3b71e7a5c3444996bd3424c8fb99"} Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.246344 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerStarted","Data":"212a6955778c005adb336cce8edf2740a0eec4caa578d7170a48bff0001b981d"} Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.252246 4909 generic.go:334] "Generic (PLEG): container finished" podID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerID="38ce820dfd0edec084dd45566d0dcf37d729ebeac2ea60199de8b1ea5c2090b0" exitCode=0 Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.252495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerDied","Data":"38ce820dfd0edec084dd45566d0dcf37d729ebeac2ea60199de8b1ea5c2090b0"} Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.252541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerStarted","Data":"53951c5c1a4a072b8af072854a3241473c4f768badee0d2c3438613bb7efcd2e"} Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.264992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlcv\" (UniqueName: \"kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv\") pod \"redhat-operators-xrznm\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.269456 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l89pq" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.348469 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.365226 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.387173 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.388849 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.406459 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.414054 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.448625 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.559810 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.560251 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbhr\" (UniqueName: 
\"kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.560275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.596603 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:32 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:32 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:32 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.596674 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.667763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.667891 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dfbhr\" (UniqueName: \"kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.667925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.668658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.668968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.695319 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbhr\" (UniqueName: \"kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr\") pod \"redhat-operators-dnsrm\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.779169 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.781960 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.971908 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ncb\" (UniqueName: \"kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb\") pod \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.979243 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume\") pod \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.979563 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume\") pod \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\" (UID: \"ddb18400-1a15-48bb-a28e-27d19e0dd04c\") " Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.980336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume" (OuterVolumeSpecName: "config-volume") pod "ddb18400-1a15-48bb-a28e-27d19e0dd04c" (UID: "ddb18400-1a15-48bb-a28e-27d19e0dd04c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.984632 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb" (OuterVolumeSpecName: "kube-api-access-g9ncb") pod "ddb18400-1a15-48bb-a28e-27d19e0dd04c" (UID: "ddb18400-1a15-48bb-a28e-27d19e0dd04c"). InnerVolumeSpecName "kube-api-access-g9ncb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:20:32 crc kubenswrapper[4909]: I1002 18:20:32.985840 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ddb18400-1a15-48bb-a28e-27d19e0dd04c" (UID: "ddb18400-1a15-48bb-a28e-27d19e0dd04c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.015169 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:20:33 crc kubenswrapper[4909]: W1002 18:20:33.055683 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86f1f08_06b1_4829_aad0_811844bb7b09.slice/crio-dbde2ae7a99eaf195b83c84aea163e447f1134762c148d2575802bc1f189f33c WatchSource:0}: Error finding container dbde2ae7a99eaf195b83c84aea163e447f1134762c148d2575802bc1f189f33c: Status 404 returned error can't find the container with id dbde2ae7a99eaf195b83c84aea163e447f1134762c148d2575802bc1f189f33c Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.081057 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb18400-1a15-48bb-a28e-27d19e0dd04c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.081090 4909 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ncb\" (UniqueName: \"kubernetes.io/projected/ddb18400-1a15-48bb-a28e-27d19e0dd04c-kube-api-access-g9ncb\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.081102 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ddb18400-1a15-48bb-a28e-27d19e0dd04c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.160593 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:20:33 crc kubenswrapper[4909]: W1002 18:20:33.169196 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod907ed506_3211_4084_90ad_2d018175a073.slice/crio-18c8e4250ea39040dd27693dc1c45d98455197ef1bde44e93ba7760b58404a7f WatchSource:0}: Error finding container 18c8e4250ea39040dd27693dc1c45d98455197ef1bde44e93ba7760b58404a7f: Status 404 returned error can't find the container with id 18c8e4250ea39040dd27693dc1c45d98455197ef1bde44e93ba7760b58404a7f Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.310448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" event={"ID":"ddb18400-1a15-48bb-a28e-27d19e0dd04c","Type":"ContainerDied","Data":"376cf554512ad3aa8c7bb1fbf1d3efca165881fb9271f377e6a37c22d310b668"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.310497 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="376cf554512ad3aa8c7bb1fbf1d3efca165881fb9271f377e6a37c22d310b668" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.310525 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.320163 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerStarted","Data":"dbde2ae7a99eaf195b83c84aea163e447f1134762c148d2575802bc1f189f33c"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.331565 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8918800a-645a-4e49-82fa-26ffc951bcf9","Type":"ContainerStarted","Data":"38806480b85f75aa727fc50a8a4a7bca63bcf6955b1b1614d1aa42ed596984e1"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.345147 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4800e271-b3fe-4909-8e1e-320a91524d59","Type":"ContainerStarted","Data":"5469b0d9b3d846748535f9472caa03abc93801c2bc255194373ce0454a06ddff"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.348288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4800e271-b3fe-4909-8e1e-320a91524d59","Type":"ContainerStarted","Data":"39173285cf694040efd0bf5448646f03c4d35de8ddb12184a1a42ff363311990"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.359071 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.359034452 podStartE2EDuration="2.359034452s" podCreationTimestamp="2025-10-02 18:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:33.348553189 +0000 UTC m=+154.536049048" watchObservedRunningTime="2025-10-02 18:20:33.359034452 +0000 UTC m=+154.546530311" Oct 02 18:20:33 
crc kubenswrapper[4909]: I1002 18:20:33.359908 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerStarted","Data":"18c8e4250ea39040dd27693dc1c45d98455197ef1bde44e93ba7760b58404a7f"} Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.405986 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.405966327 podStartE2EDuration="2.405966327s" podCreationTimestamp="2025-10-02 18:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:20:33.379336167 +0000 UTC m=+154.566832026" watchObservedRunningTime="2025-10-02 18:20:33.405966327 +0000 UTC m=+154.593462186" Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.591468 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:33 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:33 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:33 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:33 crc kubenswrapper[4909]: I1002 18:20:33.591556 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.384416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" 
event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerDied","Data":"d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5"} Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.384285 4909 generic.go:334] "Generic (PLEG): container finished" podID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerID="d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5" exitCode=0 Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.393878 4909 generic.go:334] "Generic (PLEG): container finished" podID="8918800a-645a-4e49-82fa-26ffc951bcf9" containerID="8dbfd29d3e882a413fed2d0450c22a6c3c7bf5ad48d58127d702a344866a9735" exitCode=0 Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.393979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8918800a-645a-4e49-82fa-26ffc951bcf9","Type":"ContainerDied","Data":"8dbfd29d3e882a413fed2d0450c22a6c3c7bf5ad48d58127d702a344866a9735"} Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.399104 4909 generic.go:334] "Generic (PLEG): container finished" podID="4800e271-b3fe-4909-8e1e-320a91524d59" containerID="5469b0d9b3d846748535f9472caa03abc93801c2bc255194373ce0454a06ddff" exitCode=0 Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.400864 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4800e271-b3fe-4909-8e1e-320a91524d59","Type":"ContainerDied","Data":"5469b0d9b3d846748535f9472caa03abc93801c2bc255194373ce0454a06ddff"} Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.411373 4909 generic.go:334] "Generic (PLEG): container finished" podID="907ed506-3211-4084-90ad-2d018175a073" containerID="8acfef801f43b9eeba0ba010f39f847c61c0a18f6654ee717f58b0d0828e77a9" exitCode=0 Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.411470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" 
event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerDied","Data":"8acfef801f43b9eeba0ba010f39f847c61c0a18f6654ee717f58b0d0828e77a9"} Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.589744 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:34 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:34 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:34 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:34 crc kubenswrapper[4909]: I1002 18:20:34.589792 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.595469 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:35 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:35 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:35 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.595519 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.826154 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.872636 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir\") pod \"4800e271-b3fe-4909-8e1e-320a91524d59\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.872762 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access\") pod \"4800e271-b3fe-4909-8e1e-320a91524d59\" (UID: \"4800e271-b3fe-4909-8e1e-320a91524d59\") " Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.878184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4800e271-b3fe-4909-8e1e-320a91524d59" (UID: "4800e271-b3fe-4909-8e1e-320a91524d59"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.901485 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4800e271-b3fe-4909-8e1e-320a91524d59" (UID: "4800e271-b3fe-4909-8e1e-320a91524d59"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.958075 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.974419 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4800e271-b3fe-4909-8e1e-320a91524d59-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:35 crc kubenswrapper[4909]: I1002 18:20:35.974453 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4800e271-b3fe-4909-8e1e-320a91524d59-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.075370 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir\") pod \"8918800a-645a-4e49-82fa-26ffc951bcf9\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.075726 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8918800a-645a-4e49-82fa-26ffc951bcf9" (UID: "8918800a-645a-4e49-82fa-26ffc951bcf9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.075937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access\") pod \"8918800a-645a-4e49-82fa-26ffc951bcf9\" (UID: \"8918800a-645a-4e49-82fa-26ffc951bcf9\") " Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.077270 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8918800a-645a-4e49-82fa-26ffc951bcf9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.080826 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8918800a-645a-4e49-82fa-26ffc951bcf9" (UID: "8918800a-645a-4e49-82fa-26ffc951bcf9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.178658 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8918800a-645a-4e49-82fa-26ffc951bcf9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.548557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8918800a-645a-4e49-82fa-26ffc951bcf9","Type":"ContainerDied","Data":"38806480b85f75aa727fc50a8a4a7bca63bcf6955b1b1614d1aa42ed596984e1"} Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.548604 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38806480b85f75aa727fc50a8a4a7bca63bcf6955b1b1614d1aa42ed596984e1" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.548690 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.593245 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:36 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:36 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:36 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.593693 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.597525 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4800e271-b3fe-4909-8e1e-320a91524d59","Type":"ContainerDied","Data":"39173285cf694040efd0bf5448646f03c4d35de8ddb12184a1a42ff363311990"} Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.597603 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39173285cf694040efd0bf5448646f03c4d35de8ddb12184a1a42ff363311990" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.597724 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 18:20:36 crc kubenswrapper[4909]: I1002 18:20:36.942382 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-htdx9" Oct 02 18:20:37 crc kubenswrapper[4909]: I1002 18:20:37.590192 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:37 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:37 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:37 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:37 crc kubenswrapper[4909]: I1002 18:20:37.590505 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:37 crc kubenswrapper[4909]: I1002 18:20:37.737462 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:20:38 crc kubenswrapper[4909]: I1002 18:20:38.590534 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:38 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:38 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:38 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:38 crc kubenswrapper[4909]: I1002 18:20:38.590623 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:39 crc kubenswrapper[4909]: I1002 18:20:39.593729 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:39 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Oct 02 18:20:39 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:39 crc kubenswrapper[4909]: healthz check failed Oct 02 18:20:39 crc kubenswrapper[4909]: I1002 18:20:39.593794 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:40 crc kubenswrapper[4909]: I1002 18:20:40.617584 4909 patch_prober.go:28] interesting pod/router-default-5444994796-gzl4v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 18:20:40 crc kubenswrapper[4909]: [+]has-synced ok Oct 02 18:20:40 crc kubenswrapper[4909]: [+]process-running ok Oct 02 18:20:40 crc kubenswrapper[4909]: 
healthz check failed Oct 02 18:20:40 crc kubenswrapper[4909]: I1002 18:20:40.617937 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gzl4v" podUID="6e789d01-3b09-4c6c-9931-f1a1bc87ee4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 18:20:41 crc kubenswrapper[4909]: I1002 18:20:41.593644 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:41 crc kubenswrapper[4909]: I1002 18:20:41.621508 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gzl4v" Oct 02 18:20:41 crc kubenswrapper[4909]: I1002 18:20:41.671744 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f4f74" Oct 02 18:20:41 crc kubenswrapper[4909]: I1002 18:20:41.752062 4909 patch_prober.go:28] interesting pod/console-f9d7485db-p4h8w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 02 18:20:41 crc kubenswrapper[4909]: I1002 18:20:41.752207 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-p4h8w" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 02 18:20:43 crc kubenswrapper[4909]: I1002 18:20:43.425220 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 
18:20:43 crc kubenswrapper[4909]: I1002 18:20:43.432194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccda21fa-5211-460d-b521-fc5c86673b73-metrics-certs\") pod \"network-metrics-daemon-wd57x\" (UID: \"ccda21fa-5211-460d-b521-fc5c86673b73\") " pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:43 crc kubenswrapper[4909]: I1002 18:20:43.655903 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wd57x" Oct 02 18:20:50 crc kubenswrapper[4909]: I1002 18:20:50.150972 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:20:51 crc kubenswrapper[4909]: I1002 18:20:51.756685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:51 crc kubenswrapper[4909]: I1002 18:20:51.764402 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:20:53 crc kubenswrapper[4909]: I1002 18:20:53.054574 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:20:53 crc kubenswrapper[4909]: I1002 18:20:53.054687 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:21:02 crc kubenswrapper[4909]: I1002 18:21:02.103466 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-56w8v" Oct 02 18:21:06 crc kubenswrapper[4909]: E1002 18:21:06.024259 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 18:21:06 crc kubenswrapper[4909]: E1002 18:21:06.025646 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-shx6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevic
e{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-chbqr_openshift-marketplace(f50c487d-2e9f-486b-a803-d9e4b7756c79): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:06 crc kubenswrapper[4909]: E1002 18:21:06.027637 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-chbqr" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" Oct 02 18:21:07 crc kubenswrapper[4909]: E1002 18:21:07.313770 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-chbqr" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" Oct 02 18:21:07 crc kubenswrapper[4909]: E1002 18:21:07.426263 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 18:21:07 crc kubenswrapper[4909]: E1002 18:21:07.426441 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smspx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bh4ft_openshift-marketplace(d38aa58e-ad87-4447-8b21-a260f54a41bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:07 crc kubenswrapper[4909]: E1002 18:21:07.427710 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bh4ft" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" Oct 02 18:21:07 crc 
kubenswrapper[4909]: I1002 18:21:07.746580 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 18:21:08 crc kubenswrapper[4909]: E1002 18:21:08.700305 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bh4ft" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" Oct 02 18:21:08 crc kubenswrapper[4909]: E1002 18:21:08.766436 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 18:21:08 crc kubenswrapper[4909]: E1002 18:21:08.766635 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f6v2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ssdjj_openshift-marketplace(31e86782-f422-42e5-b517-80cea89f46e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:08 crc kubenswrapper[4909]: E1002 18:21:08.770520 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ssdjj" podUID="31e86782-f422-42e5-b517-80cea89f46e3" Oct 02 18:21:11 crc 
kubenswrapper[4909]: E1002 18:21:11.535768 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ssdjj" podUID="31e86782-f422-42e5-b517-80cea89f46e3" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.635365 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.635800 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvlcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xrznm_openshift-marketplace(e86f1f08-06b1-4829-aad0-811844bb7b09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.636925 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.637017 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfbhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dnsrm_openshift-marketplace(907ed506-3211-4084-90ad-2d018175a073): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.637109 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xrznm" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.638161 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dnsrm" podUID="907ed506-3211-4084-90ad-2d018175a073" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.642598 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.642728 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4f787,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gsjh6_openshift-marketplace(16ab9591-71df-40ac-9815-5bcc4c8e5446): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.644367 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gsjh6" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" Oct 02 18:21:11 crc 
kubenswrapper[4909]: E1002 18:21:11.671827 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.672037 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7d9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-f6s2t_openshift-marketplace(a15724ee-af29-4942-8bb4-5aa22a1c03ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.673266 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f6s2t" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.709823 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.709993 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t422s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gr59x_openshift-marketplace(076f2c01-2196-4903-939a-8cc350611bf7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.711165 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gr59x" podUID="076f2c01-2196-4903-939a-8cc350611bf7" Oct 02 18:21:11 crc 
kubenswrapper[4909]: E1002 18:21:11.868432 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dnsrm" podUID="907ed506-3211-4084-90ad-2d018175a073" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.871657 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gsjh6" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.871674 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f6s2t" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.871738 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gr59x" podUID="076f2c01-2196-4903-939a-8cc350611bf7" Oct 02 18:21:11 crc kubenswrapper[4909]: E1002 18:21:11.871769 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xrznm" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" Oct 02 18:21:11 crc kubenswrapper[4909]: I1002 18:21:11.968605 4909 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wd57x"] Oct 02 18:21:11 crc kubenswrapper[4909]: W1002 18:21:11.990235 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccda21fa_5211_460d_b521_fc5c86673b73.slice/crio-cbd197e029f5f49b5b4aa60687b8cd6debad352ce294a7c240505c308fb97761 WatchSource:0}: Error finding container cbd197e029f5f49b5b4aa60687b8cd6debad352ce294a7c240505c308fb97761: Status 404 returned error can't find the container with id cbd197e029f5f49b5b4aa60687b8cd6debad352ce294a7c240505c308fb97761 Oct 02 18:21:12 crc kubenswrapper[4909]: I1002 18:21:12.875607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd57x" event={"ID":"ccda21fa-5211-460d-b521-fc5c86673b73","Type":"ContainerStarted","Data":"e5b6dc3209648110c4e00c1e4c18881a01a216a8df0f0234c31d4abeedc3610c"} Oct 02 18:21:12 crc kubenswrapper[4909]: I1002 18:21:12.876520 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd57x" event={"ID":"ccda21fa-5211-460d-b521-fc5c86673b73","Type":"ContainerStarted","Data":"7eed229d8db371041dbea6b03ab8a2eed70aa5c7662d020d67479ba674780fda"} Oct 02 18:21:12 crc kubenswrapper[4909]: I1002 18:21:12.876549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wd57x" event={"ID":"ccda21fa-5211-460d-b521-fc5c86673b73","Type":"ContainerStarted","Data":"cbd197e029f5f49b5b4aa60687b8cd6debad352ce294a7c240505c308fb97761"} Oct 02 18:21:12 crc kubenswrapper[4909]: I1002 18:21:12.905364 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wd57x" podStartSLOduration=171.905326916 podStartE2EDuration="2m51.905326916s" podCreationTimestamp="2025-10-02 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:21:12.899475116 +0000 UTC m=+194.086970985" watchObservedRunningTime="2025-10-02 18:21:12.905326916 +0000 UTC m=+194.092822775" Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.054871 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.056171 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.056270 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.057251 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.057439 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5" gracePeriod=600 Oct 
02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.959248 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5" exitCode=0 Oct 02 18:21:23 crc kubenswrapper[4909]: I1002 18:21:23.959322 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5"} Oct 02 18:21:25 crc kubenswrapper[4909]: I1002 18:21:25.980161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139"} Oct 02 18:21:25 crc kubenswrapper[4909]: I1002 18:21:25.983662 4909 generic.go:334] "Generic (PLEG): container finished" podID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerID="afb949eae904c11994fb07d956d0e4d826be371b395ac061fca3390bb3be8ce6" exitCode=0 Oct 02 18:21:25 crc kubenswrapper[4909]: I1002 18:21:25.983733 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerDied","Data":"afb949eae904c11994fb07d956d0e4d826be371b395ac061fca3390bb3be8ce6"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.013897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerStarted","Data":"eb2e05c54279bd223eac2d4b34b9400c4ccc94c9a8a4a2705b9d80a8bb045cc7"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.021307 4909 generic.go:334] "Generic (PLEG): container finished" podID="e86f1f08-06b1-4829-aad0-811844bb7b09" 
containerID="841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.021418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerDied","Data":"841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.032315 4909 generic.go:334] "Generic (PLEG): container finished" podID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerID="52a4ff17d38c1255b85200740c453d9e1bc934799bd46b34871df14b2674142f" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.032409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerDied","Data":"52a4ff17d38c1255b85200740c453d9e1bc934799bd46b34871df14b2674142f"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.038735 4909 generic.go:334] "Generic (PLEG): container finished" podID="31e86782-f422-42e5-b517-80cea89f46e3" containerID="3e86af522b97cdda96a21cda45fc662ba5e617b029d15a7595e57ff95f5334fe" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.038809 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerDied","Data":"3e86af522b97cdda96a21cda45fc662ba5e617b029d15a7595e57ff95f5334fe"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.050166 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-chbqr" podStartSLOduration=3.730807985 podStartE2EDuration="1m0.050139239s" podCreationTimestamp="2025-10-02 18:20:30 +0000 UTC" firstStartedPulling="2025-10-02 18:20:32.26613706 +0000 UTC m=+153.453632919" lastFinishedPulling="2025-10-02 
18:21:28.585468314 +0000 UTC m=+209.772964173" observedRunningTime="2025-10-02 18:21:30.046554142 +0000 UTC m=+211.234049991" watchObservedRunningTime="2025-10-02 18:21:30.050139239 +0000 UTC m=+211.237635098" Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.051879 4909 generic.go:334] "Generic (PLEG): container finished" podID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerID="19a383932abe9ee9c4cd82c13ea12404f06dde8875bf05f2cc2039b9d1a36e4f" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.051978 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerDied","Data":"19a383932abe9ee9c4cd82c13ea12404f06dde8875bf05f2cc2039b9d1a36e4f"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.062952 4909 generic.go:334] "Generic (PLEG): container finished" podID="907ed506-3211-4084-90ad-2d018175a073" containerID="cae4a2447db44c2d6c9d6fd3cce5ea3c9fc5c7ed5ce86f0cc8da1194525492d3" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.063047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerDied","Data":"cae4a2447db44c2d6c9d6fd3cce5ea3c9fc5c7ed5ce86f0cc8da1194525492d3"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.069856 4909 generic.go:334] "Generic (PLEG): container finished" podID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerID="d669af689ab6c4f26264a8aeeefdbec0272fb0d6191838b976fc5bf5e000d9c5" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.069891 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerDied","Data":"d669af689ab6c4f26264a8aeeefdbec0272fb0d6191838b976fc5bf5e000d9c5"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 
18:21:30.073301 4909 generic.go:334] "Generic (PLEG): container finished" podID="076f2c01-2196-4903-939a-8cc350611bf7" containerID="ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568" exitCode=0 Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.073334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerDied","Data":"ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568"} Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.903855 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:21:30 crc kubenswrapper[4909]: I1002 18:21:30.904234 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.092771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerStarted","Data":"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306"} Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.107101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerStarted","Data":"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b"} Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.115999 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gr59x" podStartSLOduration=2.658451872 podStartE2EDuration="1m3.115983261s" podCreationTimestamp="2025-10-02 18:20:28 +0000 UTC" firstStartedPulling="2025-10-02 18:20:30.161096875 +0000 UTC m=+151.348592734" lastFinishedPulling="2025-10-02 18:21:30.618628264 +0000 
UTC m=+211.806124123" observedRunningTime="2025-10-02 18:21:31.112247199 +0000 UTC m=+212.299743058" watchObservedRunningTime="2025-10-02 18:21:31.115983261 +0000 UTC m=+212.303479120" Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.124845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerStarted","Data":"91250a30869cc9c54cf45d993cf0a487d2adaef7c8450630df34cc2841ef1cb0"} Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.135405 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.135409 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrznm" podStartSLOduration=3.8727887279999997 podStartE2EDuration="1m0.135396693s" podCreationTimestamp="2025-10-02 18:20:31 +0000 UTC" firstStartedPulling="2025-10-02 18:20:34.388719806 +0000 UTC m=+155.576215665" lastFinishedPulling="2025-10-02 18:21:30.651327771 +0000 UTC m=+211.838823630" observedRunningTime="2025-10-02 18:21:31.132071525 +0000 UTC m=+212.319567384" watchObservedRunningTime="2025-10-02 18:21:31.135396693 +0000 UTC m=+212.322892542" Oct 02 18:21:31 crc kubenswrapper[4909]: I1002 18:21:31.192067 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnsrm" podStartSLOduration=2.8826857439999998 podStartE2EDuration="59.192049491s" podCreationTimestamp="2025-10-02 18:20:32 +0000 UTC" firstStartedPulling="2025-10-02 18:20:34.420708321 +0000 UTC m=+155.608204180" lastFinishedPulling="2025-10-02 18:21:30.730072068 +0000 UTC m=+211.917567927" observedRunningTime="2025-10-02 18:21:31.165592548 +0000 UTC m=+212.353088407" watchObservedRunningTime="2025-10-02 18:21:31.192049491 +0000 UTC m=+212.379545350" Oct 02 18:21:32 crc 
kubenswrapper[4909]: I1002 18:21:32.133813 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerStarted","Data":"b3c4cb3b324381449fb2cdb7f7e08a08ea0043d69d08d69e91c59208324fc728"} Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.137576 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerStarted","Data":"33f38d461a4eddfe55d1e2a076d715d273080f459d502d0fb909f1101bdcf6f5"} Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.140476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerStarted","Data":"226cf15778570be3b82bb24cbe3930a83ebf99d58faf91bcf8adc670573c5d35"} Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.142627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerStarted","Data":"5b03157131b0ab0da0e2d19ec6d8d9fa7a0ef4e7e06408bdb63009a544378a4e"} Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.170910 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ssdjj" podStartSLOduration=3.524215455 podStartE2EDuration="1m3.170889156s" podCreationTimestamp="2025-10-02 18:20:29 +0000 UTC" firstStartedPulling="2025-10-02 18:20:31.23039032 +0000 UTC m=+152.417886179" lastFinishedPulling="2025-10-02 18:21:30.877064021 +0000 UTC m=+212.064559880" observedRunningTime="2025-10-02 18:21:32.168497249 +0000 UTC m=+213.355993118" watchObservedRunningTime="2025-10-02 18:21:32.170889156 +0000 UTC m=+213.358385015" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.193932 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bh4ft" podStartSLOduration=3.45062879 podStartE2EDuration="1m4.193903597s" podCreationTimestamp="2025-10-02 18:20:28 +0000 UTC" firstStartedPulling="2025-10-02 18:20:30.156414591 +0000 UTC m=+151.343910450" lastFinishedPulling="2025-10-02 18:21:30.899689388 +0000 UTC m=+212.087185257" observedRunningTime="2025-10-02 18:21:32.190812506 +0000 UTC m=+213.378308375" watchObservedRunningTime="2025-10-02 18:21:32.193903597 +0000 UTC m=+213.381399456" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.231067 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gsjh6" podStartSLOduration=3.575396211 podStartE2EDuration="1m4.231009906s" podCreationTimestamp="2025-10-02 18:20:28 +0000 UTC" firstStartedPulling="2025-10-02 18:20:30.159129734 +0000 UTC m=+151.346625603" lastFinishedPulling="2025-10-02 18:21:30.814743429 +0000 UTC m=+212.002239298" observedRunningTime="2025-10-02 18:21:32.229240989 +0000 UTC m=+213.416736848" watchObservedRunningTime="2025-10-02 18:21:32.231009906 +0000 UTC m=+213.418505765" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.366561 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.366626 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.782616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:32 crc kubenswrapper[4909]: I1002 18:21:32.783000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:33 crc kubenswrapper[4909]: I1002 
18:21:33.411189 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrznm" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="registry-server" probeResult="failure" output=< Oct 02 18:21:33 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:21:33 crc kubenswrapper[4909]: > Oct 02 18:21:33 crc kubenswrapper[4909]: I1002 18:21:33.835960 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnsrm" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="registry-server" probeResult="failure" output=< Oct 02 18:21:33 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:21:33 crc kubenswrapper[4909]: > Oct 02 18:21:34 crc kubenswrapper[4909]: I1002 18:21:34.714665 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6s2t" podStartSLOduration=6.149346483 podStartE2EDuration="1m4.714642116s" podCreationTimestamp="2025-10-02 18:20:30 +0000 UTC" firstStartedPulling="2025-10-02 18:20:32.249611021 +0000 UTC m=+153.437106890" lastFinishedPulling="2025-10-02 18:21:30.814906654 +0000 UTC m=+212.002402523" observedRunningTime="2025-10-02 18:21:32.252757855 +0000 UTC m=+213.440253734" watchObservedRunningTime="2025-10-02 18:21:34.714642116 +0000 UTC m=+215.902137975" Oct 02 18:21:34 crc kubenswrapper[4909]: I1002 18:21:34.715739 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"] Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.018605 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.019056 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:21:39 crc 
kubenswrapper[4909]: I1002 18:21:39.061905 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.186372 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.186430 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.229597 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.232852 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.339034 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.340140 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.378890 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.498802 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.498848 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:39 crc kubenswrapper[4909]: I1002 18:21:39.547167 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:40 crc kubenswrapper[4909]: I1002 18:21:40.235159 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:40 crc kubenswrapper[4909]: I1002 18:21:40.235484 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:40 crc kubenswrapper[4909]: I1002 18:21:40.241863 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:21:40 crc kubenswrapper[4909]: I1002 18:21:40.963375 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:21:41 crc kubenswrapper[4909]: I1002 18:21:41.404522 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:41 crc kubenswrapper[4909]: I1002 18:21:41.405535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:41 crc kubenswrapper[4909]: I1002 18:21:41.463519 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:41 crc kubenswrapper[4909]: I1002 18:21:41.650187 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"] Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.195779 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bh4ft" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="registry-server" containerID="cri-o://226cf15778570be3b82bb24cbe3930a83ebf99d58faf91bcf8adc670573c5d35" gracePeriod=2 Oct 02 18:21:42 crc 
kubenswrapper[4909]: I1002 18:21:42.238959 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.252199 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssdjj"] Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.252396 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ssdjj" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="registry-server" containerID="cri-o://b3c4cb3b324381449fb2cdb7f7e08a08ea0043d69d08d69e91c59208324fc728" gracePeriod=2 Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.410584 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.452567 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.829382 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:42 crc kubenswrapper[4909]: I1002 18:21:42.873431 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:44 crc kubenswrapper[4909]: I1002 18:21:44.048693 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"] Oct 02 18:21:45 crc kubenswrapper[4909]: I1002 18:21:45.215621 4909 generic.go:334] "Generic (PLEG): container finished" podID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerID="226cf15778570be3b82bb24cbe3930a83ebf99d58faf91bcf8adc670573c5d35" exitCode=0 Oct 02 18:21:45 crc kubenswrapper[4909]: I1002 18:21:45.216428 4909 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6s2t" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="registry-server" containerID="cri-o://33f38d461a4eddfe55d1e2a076d715d273080f459d502d0fb909f1101bdcf6f5" gracePeriod=2 Oct 02 18:21:45 crc kubenswrapper[4909]: I1002 18:21:45.215732 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerDied","Data":"226cf15778570be3b82bb24cbe3930a83ebf99d58faf91bcf8adc670573c5d35"} Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.225489 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ssdjj_31e86782-f422-42e5-b517-80cea89f46e3/registry-server/0.log" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.226372 4909 generic.go:334] "Generic (PLEG): container finished" podID="31e86782-f422-42e5-b517-80cea89f46e3" containerID="b3c4cb3b324381449fb2cdb7f7e08a08ea0043d69d08d69e91c59208324fc728" exitCode=137 Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.226428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerDied","Data":"b3c4cb3b324381449fb2cdb7f7e08a08ea0043d69d08d69e91c59208324fc728"} Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.365315 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.517696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content\") pod \"d38aa58e-ad87-4447-8b21-a260f54a41bc\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.517810 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities\") pod \"d38aa58e-ad87-4447-8b21-a260f54a41bc\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.517847 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smspx\" (UniqueName: \"kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx\") pod \"d38aa58e-ad87-4447-8b21-a260f54a41bc\" (UID: \"d38aa58e-ad87-4447-8b21-a260f54a41bc\") " Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.518937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities" (OuterVolumeSpecName: "utilities") pod "d38aa58e-ad87-4447-8b21-a260f54a41bc" (UID: "d38aa58e-ad87-4447-8b21-a260f54a41bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.525312 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx" (OuterVolumeSpecName: "kube-api-access-smspx") pod "d38aa58e-ad87-4447-8b21-a260f54a41bc" (UID: "d38aa58e-ad87-4447-8b21-a260f54a41bc"). InnerVolumeSpecName "kube-api-access-smspx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.564178 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d38aa58e-ad87-4447-8b21-a260f54a41bc" (UID: "d38aa58e-ad87-4447-8b21-a260f54a41bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.619603 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.619635 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38aa58e-ad87-4447-8b21-a260f54a41bc-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:46 crc kubenswrapper[4909]: I1002 18:21:46.619648 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smspx\" (UniqueName: \"kubernetes.io/projected/d38aa58e-ad87-4447-8b21-a260f54a41bc-kube-api-access-smspx\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.054089 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.054422 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnsrm" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="registry-server" containerID="cri-o://91250a30869cc9c54cf45d993cf0a487d2adaef7c8450630df34cc2841ef1cb0" gracePeriod=2 Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.234050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bh4ft" event={"ID":"d38aa58e-ad87-4447-8b21-a260f54a41bc","Type":"ContainerDied","Data":"2cff69f2c28b4588ae8462732736f791c0d7227fadd4f67aa3e446885577838d"} Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.234123 4909 scope.go:117] "RemoveContainer" containerID="226cf15778570be3b82bb24cbe3930a83ebf99d58faf91bcf8adc670573c5d35" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.234143 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bh4ft" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.255244 4909 scope.go:117] "RemoveContainer" containerID="19a383932abe9ee9c4cd82c13ea12404f06dde8875bf05f2cc2039b9d1a36e4f" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.276051 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"] Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.276641 4909 scope.go:117] "RemoveContainer" containerID="52e3f305bc26154eda9e2e8fdc317b451a9e2d51d19c41d84bbbbe1a64b0dd9e" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.279005 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bh4ft"] Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.617970 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" path="/var/lib/kubelet/pods/d38aa58e-ad87-4447-8b21-a260f54a41bc/volumes" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.646196 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ssdjj_31e86782-f422-42e5-b517-80cea89f46e3/registry-server/0.log" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.647585 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.736445 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content\") pod \"31e86782-f422-42e5-b517-80cea89f46e3\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.736594 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6v2x\" (UniqueName: \"kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x\") pod \"31e86782-f422-42e5-b517-80cea89f46e3\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.736687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities\") pod \"31e86782-f422-42e5-b517-80cea89f46e3\" (UID: \"31e86782-f422-42e5-b517-80cea89f46e3\") " Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.737867 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities" (OuterVolumeSpecName: "utilities") pod "31e86782-f422-42e5-b517-80cea89f46e3" (UID: "31e86782-f422-42e5-b517-80cea89f46e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.740377 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x" (OuterVolumeSpecName: "kube-api-access-f6v2x") pod "31e86782-f422-42e5-b517-80cea89f46e3" (UID: "31e86782-f422-42e5-b517-80cea89f46e3"). InnerVolumeSpecName "kube-api-access-f6v2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.795543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31e86782-f422-42e5-b517-80cea89f46e3" (UID: "31e86782-f422-42e5-b517-80cea89f46e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.838183 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6v2x\" (UniqueName: \"kubernetes.io/projected/31e86782-f422-42e5-b517-80cea89f46e3-kube-api-access-f6v2x\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.838224 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:47 crc kubenswrapper[4909]: I1002 18:21:47.838236 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e86782-f422-42e5-b517-80cea89f46e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.244607 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ssdjj_31e86782-f422-42e5-b517-80cea89f46e3/registry-server/0.log" Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.245383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssdjj" event={"ID":"31e86782-f422-42e5-b517-80cea89f46e3","Type":"ContainerDied","Data":"dc0cacb83c8903a35402778ee886d9a401f708e64c4e5390cdd88af58ddd37fa"} Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.245449 4909 scope.go:117] "RemoveContainer" 
containerID="b3c4cb3b324381449fb2cdb7f7e08a08ea0043d69d08d69e91c59208324fc728" Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.245450 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssdjj" Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.248821 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerDied","Data":"33f38d461a4eddfe55d1e2a076d715d273080f459d502d0fb909f1101bdcf6f5"} Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.248774 4909 generic.go:334] "Generic (PLEG): container finished" podID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerID="33f38d461a4eddfe55d1e2a076d715d273080f459d502d0fb909f1101bdcf6f5" exitCode=0 Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.264783 4909 scope.go:117] "RemoveContainer" containerID="3e86af522b97cdda96a21cda45fc662ba5e617b029d15a7595e57ff95f5334fe" Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.281849 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssdjj"] Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.284404 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ssdjj"] Oct 02 18:21:48 crc kubenswrapper[4909]: I1002 18:21:48.296097 4909 scope.go:117] "RemoveContainer" containerID="22bce0f14558396fb173e8427478335ef0edef82c7101a199f80a85e3e8197cc" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.474561 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.560507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities\") pod \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.560615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7d9g\" (UniqueName: \"kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g\") pod \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.560798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content\") pod \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\" (UID: \"a15724ee-af29-4942-8bb4-5aa22a1c03ff\") " Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.561731 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities" (OuterVolumeSpecName: "utilities") pod "a15724ee-af29-4942-8bb4-5aa22a1c03ff" (UID: "a15724ee-af29-4942-8bb4-5aa22a1c03ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.568871 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g" (OuterVolumeSpecName: "kube-api-access-q7d9g") pod "a15724ee-af29-4942-8bb4-5aa22a1c03ff" (UID: "a15724ee-af29-4942-8bb4-5aa22a1c03ff"). InnerVolumeSpecName "kube-api-access-q7d9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.573638 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a15724ee-af29-4942-8bb4-5aa22a1c03ff" (UID: "a15724ee-af29-4942-8bb4-5aa22a1c03ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.615409 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e86782-f422-42e5-b517-80cea89f46e3" path="/var/lib/kubelet/pods/31e86782-f422-42e5-b517-80cea89f46e3/volumes" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.662329 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.662372 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7d9g\" (UniqueName: \"kubernetes.io/projected/a15724ee-af29-4942-8bb4-5aa22a1c03ff-kube-api-access-q7d9g\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:49 crc kubenswrapper[4909]: I1002 18:21:49.662382 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15724ee-af29-4942-8bb4-5aa22a1c03ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.267919 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6s2t" event={"ID":"a15724ee-af29-4942-8bb4-5aa22a1c03ff","Type":"ContainerDied","Data":"212a6955778c005adb336cce8edf2740a0eec4caa578d7170a48bff0001b981d"} Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.268015 4909 scope.go:117] "RemoveContainer" 
containerID="33f38d461a4eddfe55d1e2a076d715d273080f459d502d0fb909f1101bdcf6f5" Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.267963 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6s2t" Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.270944 4909 generic.go:334] "Generic (PLEG): container finished" podID="907ed506-3211-4084-90ad-2d018175a073" containerID="91250a30869cc9c54cf45d993cf0a487d2adaef7c8450630df34cc2841ef1cb0" exitCode=0 Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.271003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerDied","Data":"91250a30869cc9c54cf45d993cf0a487d2adaef7c8450630df34cc2841ef1cb0"} Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.287545 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"] Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.292243 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6s2t"] Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.296728 4909 scope.go:117] "RemoveContainer" containerID="52a4ff17d38c1255b85200740c453d9e1bc934799bd46b34871df14b2674142f" Oct 02 18:21:50 crc kubenswrapper[4909]: I1002 18:21:50.315340 4909 scope.go:117] "RemoveContainer" containerID="584f9913e0241d9cd10262d313a52260d20d3b71e7a5c3444996bd3424c8fb99" Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.617361 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" path="/var/lib/kubelet/pods/a15724ee-af29-4942-8bb4-5aa22a1c03ff/volumes" Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.867053 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.992141 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities\") pod \"907ed506-3211-4084-90ad-2d018175a073\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.992349 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content\") pod \"907ed506-3211-4084-90ad-2d018175a073\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.992435 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfbhr\" (UniqueName: \"kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr\") pod \"907ed506-3211-4084-90ad-2d018175a073\" (UID: \"907ed506-3211-4084-90ad-2d018175a073\") " Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.993159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities" (OuterVolumeSpecName: "utilities") pod "907ed506-3211-4084-90ad-2d018175a073" (UID: "907ed506-3211-4084-90ad-2d018175a073"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:51 crc kubenswrapper[4909]: I1002 18:21:51.997523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr" (OuterVolumeSpecName: "kube-api-access-dfbhr") pod "907ed506-3211-4084-90ad-2d018175a073" (UID: "907ed506-3211-4084-90ad-2d018175a073"). InnerVolumeSpecName "kube-api-access-dfbhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.093721 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfbhr\" (UniqueName: \"kubernetes.io/projected/907ed506-3211-4084-90ad-2d018175a073-kube-api-access-dfbhr\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.093769 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.291885 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsrm" event={"ID":"907ed506-3211-4084-90ad-2d018175a073","Type":"ContainerDied","Data":"18c8e4250ea39040dd27693dc1c45d98455197ef1bde44e93ba7760b58404a7f"} Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.291950 4909 scope.go:117] "RemoveContainer" containerID="91250a30869cc9c54cf45d993cf0a487d2adaef7c8450630df34cc2841ef1cb0" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.291986 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsrm" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.311414 4909 scope.go:117] "RemoveContainer" containerID="cae4a2447db44c2d6c9d6fd3cce5ea3c9fc5c7ed5ce86f0cc8da1194525492d3" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.328585 4909 scope.go:117] "RemoveContainer" containerID="8acfef801f43b9eeba0ba010f39f847c61c0a18f6654ee717f58b0d0828e77a9" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.507471 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "907ed506-3211-4084-90ad-2d018175a073" (UID: "907ed506-3211-4084-90ad-2d018175a073"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.600293 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907ed506-3211-4084-90ad-2d018175a073-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.623534 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:21:52 crc kubenswrapper[4909]: I1002 18:21:52.628959 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnsrm"] Oct 02 18:21:53 crc kubenswrapper[4909]: I1002 18:21:53.706085 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907ed506-3211-4084-90ad-2d018175a073" path="/var/lib/kubelet/pods/907ed506-3211-4084-90ad-2d018175a073/volumes" Oct 02 18:21:59 crc kubenswrapper[4909]: I1002 18:21:59.738965 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerName="oauth-openshift" containerID="cri-o://39e190b1dd664f6241702990ce8a669e33d1cd83e4b5dded009dd47ab7ff579e" gracePeriod=15 Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.344477 4909 generic.go:334] "Generic (PLEG): container finished" podID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerID="39e190b1dd664f6241702990ce8a669e33d1cd83e4b5dded009dd47ab7ff579e" exitCode=0 Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.344634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" event={"ID":"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6","Type":"ContainerDied","Data":"39e190b1dd664f6241702990ce8a669e33d1cd83e4b5dded009dd47ab7ff579e"} Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.777443 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.882873 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7559487fb5-t95sh"] Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883479 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883499 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883513 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8918800a-645a-4e49-82fa-26ffc951bcf9" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883522 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8918800a-645a-4e49-82fa-26ffc951bcf9" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883532 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb18400-1a15-48bb-a28e-27d19e0dd04c" containerName="collect-profiles" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883542 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb18400-1a15-48bb-a28e-27d19e0dd04c" containerName="collect-profiles" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883551 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883560 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883568 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883578 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883592 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883601 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883614 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883633 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883641 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883655 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883663 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883674 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883682 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883693 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883701 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="extract-content" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883710 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerName="oauth-openshift" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883716 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerName="oauth-openshift" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883726 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883733 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="extract-utilities" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883744 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4800e271-b3fe-4909-8e1e-320a91524d59" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883755 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4800e271-b3fe-4909-8e1e-320a91524d59" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883766 4909 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883775 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: E1002 18:22:00.883788 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883795 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883912 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15724ee-af29-4942-8bb4-5aa22a1c03ff" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883931 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4800e271-b3fe-4909-8e1e-320a91524d59" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883942 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38aa58e-ad87-4447-8b21-a260f54a41bc" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883977 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb18400-1a15-48bb-a28e-27d19e0dd04c" containerName="collect-profiles" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.883990 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" containerName="oauth-openshift" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.884001 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="907ed506-3211-4084-90ad-2d018175a073" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.884009 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="31e86782-f422-42e5-b517-80cea89f46e3" containerName="registry-server" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.884018 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8918800a-645a-4e49-82fa-26ffc951bcf9" containerName="pruner" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.884515 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.900762 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7559487fb5-t95sh"] Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916402 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916459 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916477 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916500 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916542 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916559 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjq9d\" (UniqueName: \"kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916611 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916646 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916672 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916712 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916788 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir\") pod \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.916816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session\") pod 
\"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\" (UID: \"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6\") " Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.918060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.918683 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.918912 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.919579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.921506 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.934002 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d" (OuterVolumeSpecName: "kube-api-access-mjq9d") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "kube-api-access-mjq9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.934338 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.934801 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.937193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.937864 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.939687 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.941499 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.944620 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:00 crc kubenswrapper[4909]: I1002 18:22:00.945814 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" (UID: "2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.018975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp4l6\" (UniqueName: \"kubernetes.io/projected/6d1b58a1-e16b-425d-8107-a811281e725c-kube-api-access-hp4l6\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019106 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: 
I1002 18:22:01.019142 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d1b58a1-e16b-425d-8107-a811281e725c-audit-dir\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019198 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-session\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019217 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019409 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-router-certs\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019550 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019649 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-audit-policies\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019752 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-login\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019787 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-error\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019866 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019887 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019910 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019929 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019952 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019972 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.019989 4909 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020001 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020016 4909 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020238 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020269 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020285 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020301 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.020311 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjq9d\" (UniqueName: \"kubernetes.io/projected/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6-kube-api-access-mjq9d\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-audit-policies\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") 
" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-login\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122229 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-error\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4l6\" (UniqueName: \"kubernetes.io/projected/6d1b58a1-e16b-425d-8107-a811281e725c-kube-api-access-hp4l6\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122314 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122390 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d1b58a1-e16b-425d-8107-a811281e725c-audit-dir\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122441 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-session\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122526 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") 
" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122559 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d1b58a1-e16b-425d-8107-a811281e725c-audit-dir\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122596 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-router-certs\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122631 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.122689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-service-ca\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.123227 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-audit-policies\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.123274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.124263 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.124612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-service-ca\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc 
kubenswrapper[4909]: I1002 18:22:01.126486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-error\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.126565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.127671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.127982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-router-certs\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.128572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.128999 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.131536 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-system-session\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.131605 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d1b58a1-e16b-425d-8107-a811281e725c-v4-0-config-user-template-login\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.150726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4l6\" (UniqueName: \"kubernetes.io/projected/6d1b58a1-e16b-425d-8107-a811281e725c-kube-api-access-hp4l6\") pod \"oauth-openshift-7559487fb5-t95sh\" (UID: \"6d1b58a1-e16b-425d-8107-a811281e725c\") " pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" 
Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.240287 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.359149 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" event={"ID":"2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6","Type":"ContainerDied","Data":"a0dd6bcb7a7282dd345f134db6bf5be919b0c2c531fcfeefface1350a5836fe4"} Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.359806 4909 scope.go:117] "RemoveContainer" containerID="39e190b1dd664f6241702990ce8a669e33d1cd83e4b5dded009dd47ab7ff579e" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.359336 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8nk9p" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.406158 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"] Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.409635 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8nk9p"] Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.617159 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6" path="/var/lib/kubelet/pods/2b563e9f-f5eb-4fb4-92c5-3f82e2ad43b6/volumes" Oct 02 18:22:01 crc kubenswrapper[4909]: I1002 18:22:01.700231 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7559487fb5-t95sh"] Oct 02 18:22:01 crc kubenswrapper[4909]: W1002 18:22:01.705074 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1b58a1_e16b_425d_8107_a811281e725c.slice/crio-fffe4726561ec64d85032876b123e3dbb714e9326f24bde992ca269817b9306a WatchSource:0}: Error finding container fffe4726561ec64d85032876b123e3dbb714e9326f24bde992ca269817b9306a: Status 404 returned error can't find the container with id fffe4726561ec64d85032876b123e3dbb714e9326f24bde992ca269817b9306a Oct 02 18:22:02 crc kubenswrapper[4909]: I1002 18:22:02.369578 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" event={"ID":"6d1b58a1-e16b-425d-8107-a811281e725c","Type":"ContainerStarted","Data":"cbfc9ba97daff3b04fc7634b54805a04f5c08a12280e538737f77c4314bca045"} Oct 02 18:22:02 crc kubenswrapper[4909]: I1002 18:22:02.369935 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:02 crc kubenswrapper[4909]: I1002 18:22:02.369959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" event={"ID":"6d1b58a1-e16b-425d-8107-a811281e725c","Type":"ContainerStarted","Data":"fffe4726561ec64d85032876b123e3dbb714e9326f24bde992ca269817b9306a"} Oct 02 18:22:02 crc kubenswrapper[4909]: I1002 18:22:02.399145 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" podStartSLOduration=28.39912072 podStartE2EDuration="28.39912072s" podCreationTimestamp="2025-10-02 18:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:22:02.39699204 +0000 UTC m=+243.584487909" watchObservedRunningTime="2025-10-02 18:22:02.39912072 +0000 UTC m=+243.586616609" Oct 02 18:22:02 crc kubenswrapper[4909]: I1002 18:22:02.715226 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-7559487fb5-t95sh" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.264437 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gr59x"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.265229 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gr59x" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="registry-server" containerID="cri-o://a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306" gracePeriod=30 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.279467 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gsjh6"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.279772 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gsjh6" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="registry-server" containerID="cri-o://5b03157131b0ab0da0e2d19ec6d8d9fa7a0ef4e7e06408bdb63009a544378a4e" gracePeriod=30 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.285617 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.297355 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" containerID="cri-o://b9233162b7819c9d16f557083e7a4e390d2134b943e9ac9682e0ded23421ea0f" gracePeriod=30 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.299643 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.299929 4909 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-chbqr" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="registry-server" containerID="cri-o://eb2e05c54279bd223eac2d4b34b9400c4ccc94c9a8a4a2705b9d80a8bb045cc7" gracePeriod=30 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.313469 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvd2b"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.314813 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.315478 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.315839 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrznm" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="registry-server" containerID="cri-o://dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b" gracePeriod=30 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.335686 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvd2b"] Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.412572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.412928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.412961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qfd2\" (UniqueName: \"kubernetes.io/projected/e7e3b138-c134-41ec-a5d6-af2e97914045-kube-api-access-4qfd2\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.449528 4909 generic.go:334] "Generic (PLEG): container finished" podID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerID="eb2e05c54279bd223eac2d4b34b9400c4ccc94c9a8a4a2705b9d80a8bb045cc7" exitCode=0 Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.449610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerDied","Data":"eb2e05c54279bd223eac2d4b34b9400c4ccc94c9a8a4a2705b9d80a8bb045cc7"} Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.514099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.514198 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.514237 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qfd2\" (UniqueName: \"kubernetes.io/projected/e7e3b138-c134-41ec-a5d6-af2e97914045-kube-api-access-4qfd2\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.516061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.522910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e3b138-c134-41ec-a5d6-af2e97914045-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc kubenswrapper[4909]: I1002 18:22:15.530800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qfd2\" (UniqueName: \"kubernetes.io/projected/e7e3b138-c134-41ec-a5d6-af2e97914045-kube-api-access-4qfd2\") pod \"marketplace-operator-79b997595-zvd2b\" (UID: \"e7e3b138-c134-41ec-a5d6-af2e97914045\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:15 crc 
kubenswrapper[4909]: I1002 18:22:15.640149 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.071452 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvd2b"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.261169 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.330420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities\") pod \"076f2c01-2196-4903-939a-8cc350611bf7\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.330568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content\") pod \"076f2c01-2196-4903-939a-8cc350611bf7\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.330715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t422s\" (UniqueName: \"kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s\") pod \"076f2c01-2196-4903-939a-8cc350611bf7\" (UID: \"076f2c01-2196-4903-939a-8cc350611bf7\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.333770 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities" (OuterVolumeSpecName: "utilities") pod "076f2c01-2196-4903-939a-8cc350611bf7" (UID: "076f2c01-2196-4903-939a-8cc350611bf7"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.338312 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s" (OuterVolumeSpecName: "kube-api-access-t422s") pod "076f2c01-2196-4903-939a-8cc350611bf7" (UID: "076f2c01-2196-4903-939a-8cc350611bf7"). InnerVolumeSpecName "kube-api-access-t422s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.340097 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.409011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "076f2c01-2196-4903-939a-8cc350611bf7" (UID: "076f2c01-2196-4903-939a-8cc350611bf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434542 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities\") pod \"e86f1f08-06b1-4829-aad0-811844bb7b09\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434600 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvlcv\" (UniqueName: \"kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv\") pod \"e86f1f08-06b1-4829-aad0-811844bb7b09\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434701 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content\") pod \"e86f1f08-06b1-4829-aad0-811844bb7b09\" (UID: \"e86f1f08-06b1-4829-aad0-811844bb7b09\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434967 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434985 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/076f2c01-2196-4903-939a-8cc350611bf7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.434998 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t422s\" (UniqueName: \"kubernetes.io/projected/076f2c01-2196-4903-939a-8cc350611bf7-kube-api-access-t422s\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.435515 
4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities" (OuterVolumeSpecName: "utilities") pod "e86f1f08-06b1-4829-aad0-811844bb7b09" (UID: "e86f1f08-06b1-4829-aad0-811844bb7b09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.440468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv" (OuterVolumeSpecName: "kube-api-access-kvlcv") pod "e86f1f08-06b1-4829-aad0-811844bb7b09" (UID: "e86f1f08-06b1-4829-aad0-811844bb7b09"). InnerVolumeSpecName "kube-api-access-kvlcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.460809 4909 generic.go:334] "Generic (PLEG): container finished" podID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerID="dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b" exitCode=0 Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.460884 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerDied","Data":"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.460918 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrznm" event={"ID":"e86f1f08-06b1-4829-aad0-811844bb7b09","Type":"ContainerDied","Data":"dbde2ae7a99eaf195b83c84aea163e447f1134762c148d2575802bc1f189f33c"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.460940 4909 scope.go:117] "RemoveContainer" containerID="dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.461118 4909 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrznm" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.466192 4909 generic.go:334] "Generic (PLEG): container finished" podID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerID="b9233162b7819c9d16f557083e7a4e390d2134b943e9ac9682e0ded23421ea0f" exitCode=0 Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.466255 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" event={"ID":"bc0dec87-de8d-40c7-99ad-3e0f933885e6","Type":"ContainerDied","Data":"b9233162b7819c9d16f557083e7a4e390d2134b943e9ac9682e0ded23421ea0f"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.466372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" event={"ID":"bc0dec87-de8d-40c7-99ad-3e0f933885e6","Type":"ContainerDied","Data":"20fbb6f9d5ca2049a0835ad2dbe3b65242472e42eae97899f3100a141db97167"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.466390 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20fbb6f9d5ca2049a0835ad2dbe3b65242472e42eae97899f3100a141db97167" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.470095 4909 generic.go:334] "Generic (PLEG): container finished" podID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerID="5b03157131b0ab0da0e2d19ec6d8d9fa7a0ef4e7e06408bdb63009a544378a4e" exitCode=0 Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.470139 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerDied","Data":"5b03157131b0ab0da0e2d19ec6d8d9fa7a0ef4e7e06408bdb63009a544378a4e"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.470154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gsjh6" 
event={"ID":"16ab9591-71df-40ac-9815-5bcc4c8e5446","Type":"ContainerDied","Data":"4386889c3f6d67e3f017e16a9b3c28756d71fdc739df9d755acb9be6a5305312"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.470165 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4386889c3f6d67e3f017e16a9b3c28756d71fdc739df9d755acb9be6a5305312" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.470261 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.474449 4909 generic.go:334] "Generic (PLEG): container finished" podID="076f2c01-2196-4903-939a-8cc350611bf7" containerID="a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306" exitCode=0 Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.474575 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr59x" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.474631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerDied","Data":"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.474656 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr59x" event={"ID":"076f2c01-2196-4903-939a-8cc350611bf7","Type":"ContainerDied","Data":"da4660d5432a9e554d162cb3ad31de89ce032e5d95311c4cf102f3d896221588"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.477755 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chbqr" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.478434 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chbqr" event={"ID":"f50c487d-2e9f-486b-a803-d9e4b7756c79","Type":"ContainerDied","Data":"53951c5c1a4a072b8af072854a3241473c4f768badee0d2c3438613bb7efcd2e"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.479972 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.480652 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" event={"ID":"e7e3b138-c134-41ec-a5d6-af2e97914045","Type":"ContainerStarted","Data":"74b78b0c37143b5e15ad8c70c7836cdd9c3b2a0866e463fce2ee2b3509e3e12e"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.480692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" event={"ID":"e7e3b138-c134-41ec-a5d6-af2e97914045","Type":"ContainerStarted","Data":"dd9c52eff15422cd64f77b1825a368a858de6a7cc7ac8eb77830bedc16f5f611"} Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.481689 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.484656 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zvd2b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.485491 4909 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" podUID="e7e3b138-c134-41ec-a5d6-af2e97914045" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.484913 4909 scope.go:117] "RemoveContainer" containerID="841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.487160 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.535531 4909 scope.go:117] "RemoveContainer" containerID="d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.535895 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shx6t\" (UniqueName: \"kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t\") pod \"f50c487d-2e9f-486b-a803-d9e4b7756c79\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.535989 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content\") pod \"f50c487d-2e9f-486b-a803-d9e4b7756c79\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.536036 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities\") pod \"f50c487d-2e9f-486b-a803-d9e4b7756c79\" (UID: \"f50c487d-2e9f-486b-a803-d9e4b7756c79\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.536319 4909 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.536331 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvlcv\" (UniqueName: \"kubernetes.io/projected/e86f1f08-06b1-4829-aad0-811844bb7b09-kube-api-access-kvlcv\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.537018 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities" (OuterVolumeSpecName: "utilities") pod "f50c487d-2e9f-486b-a803-d9e4b7756c79" (UID: "f50c487d-2e9f-486b-a803-d9e4b7756c79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.540977 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t" (OuterVolumeSpecName: "kube-api-access-shx6t") pod "f50c487d-2e9f-486b-a803-d9e4b7756c79" (UID: "f50c487d-2e9f-486b-a803-d9e4b7756c79"). InnerVolumeSpecName "kube-api-access-shx6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.560851 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" podStartSLOduration=1.5608263340000001 podStartE2EDuration="1.560826334s" podCreationTimestamp="2025-10-02 18:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:22:16.558323522 +0000 UTC m=+257.745819401" watchObservedRunningTime="2025-10-02 18:22:16.560826334 +0000 UTC m=+257.748322193" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.561823 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f50c487d-2e9f-486b-a803-d9e4b7756c79" (UID: "f50c487d-2e9f-486b-a803-d9e4b7756c79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.587143 4909 scope.go:117] "RemoveContainer" containerID="dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.589819 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b\": container with ID starting with dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b not found: ID does not exist" containerID="dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.589894 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b"} err="failed to get container status \"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b\": rpc error: code = NotFound desc = could not find container \"dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b\": container with ID starting with dd4311840e122f24da476069496b45c737619a5bb6c100419cfc77a001e4f08b not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.589933 4909 scope.go:117] "RemoveContainer" containerID="841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.590583 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf\": container with ID starting with 841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf not found: ID does not exist" containerID="841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.590642 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf"} err="failed to get container status \"841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf\": rpc error: code = NotFound desc = could not find container \"841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf\": container with ID starting with 841315b2f07ffc3e6ce42dd9e2cbd50bb9c1fa42455c69deff412752197266cf not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.590684 4909 scope.go:117] "RemoveContainer" containerID="d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.591077 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5\": container with ID starting with d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5 not found: ID does not exist" containerID="d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.591116 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5"} err="failed to get container status \"d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5\": rpc error: code = NotFound desc = could not find container \"d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5\": container with ID starting with d91ee7a456f73ec5c27b46bb4eda62a28485e4b5abfcefdb6fda41735af343f5 not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.591141 4909 scope.go:117] "RemoveContainer" containerID="a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 
18:22:16.606278 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gr59x"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.612352 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86f1f08-06b1-4829-aad0-811844bb7b09" (UID: "e86f1f08-06b1-4829-aad0-811844bb7b09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.612679 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gr59x"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.616499 4909 scope.go:117] "RemoveContainer" containerID="ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.635342 4909 scope.go:117] "RemoveContainer" containerID="5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.636796 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities\") pod \"16ab9591-71df-40ac-9815-5bcc4c8e5446\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.636831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f787\" (UniqueName: \"kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787\") pod \"16ab9591-71df-40ac-9815-5bcc4c8e5446\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.636865 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85v4f\" (UniqueName: 
\"kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f\") pod \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.636895 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics\") pod \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.636947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content\") pod \"16ab9591-71df-40ac-9815-5bcc4c8e5446\" (UID: \"16ab9591-71df-40ac-9815-5bcc4c8e5446\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.637017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca\") pod \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\" (UID: \"bc0dec87-de8d-40c7-99ad-3e0f933885e6\") " Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.637305 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.637326 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50c487d-2e9f-486b-a803-d9e4b7756c79-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.637335 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e86f1f08-06b1-4829-aad0-811844bb7b09-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.637344 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shx6t\" (UniqueName: \"kubernetes.io/projected/f50c487d-2e9f-486b-a803-d9e4b7756c79-kube-api-access-shx6t\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.638616 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bc0dec87-de8d-40c7-99ad-3e0f933885e6" (UID: "bc0dec87-de8d-40c7-99ad-3e0f933885e6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.639366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities" (OuterVolumeSpecName: "utilities") pod "16ab9591-71df-40ac-9815-5bcc4c8e5446" (UID: "16ab9591-71df-40ac-9815-5bcc4c8e5446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.642295 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f" (OuterVolumeSpecName: "kube-api-access-85v4f") pod "bc0dec87-de8d-40c7-99ad-3e0f933885e6" (UID: "bc0dec87-de8d-40c7-99ad-3e0f933885e6"). InnerVolumeSpecName "kube-api-access-85v4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.642476 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bc0dec87-de8d-40c7-99ad-3e0f933885e6" (UID: "bc0dec87-de8d-40c7-99ad-3e0f933885e6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.642625 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787" (OuterVolumeSpecName: "kube-api-access-4f787") pod "16ab9591-71df-40ac-9815-5bcc4c8e5446" (UID: "16ab9591-71df-40ac-9815-5bcc4c8e5446"). InnerVolumeSpecName "kube-api-access-4f787". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.662002 4909 scope.go:117] "RemoveContainer" containerID="a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.662770 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306\": container with ID starting with a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306 not found: ID does not exist" containerID="a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.662837 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306"} err="failed to get container status \"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306\": rpc error: code = 
NotFound desc = could not find container \"a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306\": container with ID starting with a4ea99d656c13b370e7115eafc1e44345548a21dd8aad74d5d28d62c84ee4306 not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.662881 4909 scope.go:117] "RemoveContainer" containerID="ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.663260 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568\": container with ID starting with ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568 not found: ID does not exist" containerID="ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.663292 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568"} err="failed to get container status \"ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568\": rpc error: code = NotFound desc = could not find container \"ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568\": container with ID starting with ad2c99e0ab2832fb2a00d432b73f358b46b09f0dd1daadfb4bc8df0bb82f8568 not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.663310 4909 scope.go:117] "RemoveContainer" containerID="5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2" Oct 02 18:22:16 crc kubenswrapper[4909]: E1002 18:22:16.663695 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2\": container with ID starting with 
5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2 not found: ID does not exist" containerID="5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.664076 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2"} err="failed to get container status \"5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2\": rpc error: code = NotFound desc = could not find container \"5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2\": container with ID starting with 5d272196f4dd29aaa7ac2cd0b1bbaa4da444d0997655cbd6c0390f475e6d05d2 not found: ID does not exist" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.664098 4909 scope.go:117] "RemoveContainer" containerID="eb2e05c54279bd223eac2d4b34b9400c4ccc94c9a8a4a2705b9d80a8bb045cc7" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.680869 4909 scope.go:117] "RemoveContainer" containerID="afb949eae904c11994fb07d956d0e4d826be371b395ac061fca3390bb3be8ce6" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.704918 4909 scope.go:117] "RemoveContainer" containerID="38ce820dfd0edec084dd45566d0dcf37d729ebeac2ea60199de8b1ea5c2090b0" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.711889 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16ab9591-71df-40ac-9815-5bcc4c8e5446" (UID: "16ab9591-71df-40ac-9815-5bcc4c8e5446"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738481 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738525 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738540 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ab9591-71df-40ac-9815-5bcc4c8e5446-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738550 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f787\" (UniqueName: \"kubernetes.io/projected/16ab9591-71df-40ac-9815-5bcc4c8e5446-kube-api-access-4f787\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738561 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85v4f\" (UniqueName: \"kubernetes.io/projected/bc0dec87-de8d-40c7-99ad-3e0f933885e6-kube-api-access-85v4f\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.738572 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc0dec87-de8d-40c7-99ad-3e0f933885e6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.789569 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.793601 4909 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-xrznm"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.810979 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"] Oct 02 18:22:16 crc kubenswrapper[4909]: I1002 18:22:16.815578 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-chbqr"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278009 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wflc9"] Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278240 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278254 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278264 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278273 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278282 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278288 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278295 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="extract-content" Oct 02 
18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278302 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278310 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278317 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278326 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278333 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278340 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278347 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278359 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278365 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278371 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="registry-server" Oct 02 
18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278376 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278388 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278394 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="extract-utilities" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278403 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278409 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278420 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278428 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: E1002 18:22:17.278438 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278445 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="extract-content" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278537 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" containerName="registry-server" Oct 02 18:22:17 crc 
kubenswrapper[4909]: I1002 18:22:17.278553 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" containerName="marketplace-operator" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278562 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278570 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="076f2c01-2196-4903-939a-8cc350611bf7" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.278582 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" containerName="registry-server" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.279356 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.283041 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.292699 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wflc9"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.345670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-catalog-content\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.345713 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-utilities\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.345751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgmm8\" (UniqueName: \"kubernetes.io/projected/7453f0a3-5410-4768-ac27-88ac0ad93046-kube-api-access-pgmm8\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.447323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-catalog-content\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.447378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-utilities\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.447414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgmm8\" (UniqueName: \"kubernetes.io/projected/7453f0a3-5410-4768-ac27-88ac0ad93046-kube-api-access-pgmm8\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.448310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-catalog-content\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.448315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7453f0a3-5410-4768-ac27-88ac0ad93046-utilities\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.466463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgmm8\" (UniqueName: \"kubernetes.io/projected/7453f0a3-5410-4768-ac27-88ac0ad93046-kube-api-access-pgmm8\") pod \"certified-operators-wflc9\" (UID: \"7453f0a3-5410-4768-ac27-88ac0ad93046\") " pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.490854 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-645jg" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.490879 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gsjh6" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.493324 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zvd2b" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.540200 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.545278 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-645jg"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.551649 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gsjh6"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.556103 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gsjh6"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.608506 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.614731 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076f2c01-2196-4903-939a-8cc350611bf7" path="/var/lib/kubelet/pods/076f2c01-2196-4903-939a-8cc350611bf7/volumes" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.615375 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ab9591-71df-40ac-9815-5bcc4c8e5446" path="/var/lib/kubelet/pods/16ab9591-71df-40ac-9815-5bcc4c8e5446/volumes" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.616255 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0dec87-de8d-40c7-99ad-3e0f933885e6" path="/var/lib/kubelet/pods/bc0dec87-de8d-40c7-99ad-3e0f933885e6/volumes" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.617181 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86f1f08-06b1-4829-aad0-811844bb7b09" path="/var/lib/kubelet/pods/e86f1f08-06b1-4829-aad0-811844bb7b09/volumes" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.617854 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50c487d-2e9f-486b-a803-d9e4b7756c79" path="/var/lib/kubelet/pods/f50c487d-2e9f-486b-a803-d9e4b7756c79/volumes" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.772457 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wflc9"] Oct 02 18:22:17 crc kubenswrapper[4909]: W1002 18:22:17.777298 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7453f0a3_5410_4768_ac27_88ac0ad93046.slice/crio-ba4c73398a3f650cae6440cfe40dc0025b5dffd3eb0eccadd06b9881f725b043 WatchSource:0}: Error finding container ba4c73398a3f650cae6440cfe40dc0025b5dffd3eb0eccadd06b9881f725b043: Status 404 returned error can't find the container with id 
ba4c73398a3f650cae6440cfe40dc0025b5dffd3eb0eccadd06b9881f725b043 Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.883412 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc2w"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.884682 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.887405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.894280 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc2w"] Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.954072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6z5\" (UniqueName: \"kubernetes.io/projected/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-kube-api-access-km6z5\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.954152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-catalog-content\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:17 crc kubenswrapper[4909]: I1002 18:22:17.954184 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-utilities\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " 
pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.055723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km6z5\" (UniqueName: \"kubernetes.io/projected/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-kube-api-access-km6z5\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.055798 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-catalog-content\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.055831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-utilities\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.056757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-utilities\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.056842 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-catalog-content\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" 
Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.079215 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6z5\" (UniqueName: \"kubernetes.io/projected/63f1acf8-a7a9-4a20-a6b7-fec8d828619d-kube-api-access-km6z5\") pod \"redhat-marketplace-vdc2w\" (UID: \"63f1acf8-a7a9-4a20-a6b7-fec8d828619d\") " pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.201383 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.406528 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdc2w"] Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.498489 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc2w" event={"ID":"63f1acf8-a7a9-4a20-a6b7-fec8d828619d","Type":"ContainerStarted","Data":"c831901cf51da5b42f0e77345507ba1b19de50f9280622f1b5f3527bae728673"} Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.500949 4909 generic.go:334] "Generic (PLEG): container finished" podID="7453f0a3-5410-4768-ac27-88ac0ad93046" containerID="98e8d9c245a199c5b08f5322468addb6b9ef865b1bff82e26797574f82be4dfe" exitCode=0 Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.501189 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wflc9" event={"ID":"7453f0a3-5410-4768-ac27-88ac0ad93046","Type":"ContainerDied","Data":"98e8d9c245a199c5b08f5322468addb6b9ef865b1bff82e26797574f82be4dfe"} Oct 02 18:22:18 crc kubenswrapper[4909]: I1002 18:22:18.501248 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wflc9" event={"ID":"7453f0a3-5410-4768-ac27-88ac0ad93046","Type":"ContainerStarted","Data":"ba4c73398a3f650cae6440cfe40dc0025b5dffd3eb0eccadd06b9881f725b043"} Oct 02 
18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.507431 4909 generic.go:334] "Generic (PLEG): container finished" podID="63f1acf8-a7a9-4a20-a6b7-fec8d828619d" containerID="132df045146ec9a2cbc0c8cd32d735db16002b65bf0d02b3449fa3c9ff4f17e8" exitCode=0 Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.507539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc2w" event={"ID":"63f1acf8-a7a9-4a20-a6b7-fec8d828619d","Type":"ContainerDied","Data":"132df045146ec9a2cbc0c8cd32d735db16002b65bf0d02b3449fa3c9ff4f17e8"} Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.683141 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khlvb"] Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.684734 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.688561 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.695097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khlvb"] Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.781927 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-utilities\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.781988 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6sn\" (UniqueName: \"kubernetes.io/projected/af93c2fc-1286-4c32-b8ec-b582a58114b8-kube-api-access-5s6sn\") pod 
\"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.782086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-catalog-content\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.883982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-utilities\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.884084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6sn\" (UniqueName: \"kubernetes.io/projected/af93c2fc-1286-4c32-b8ec-b582a58114b8-kube-api-access-5s6sn\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.884150 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-catalog-content\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.885382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-catalog-content\") pod \"redhat-operators-khlvb\" (UID: 
\"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.886244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93c2fc-1286-4c32-b8ec-b582a58114b8-utilities\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:19 crc kubenswrapper[4909]: I1002 18:22:19.915615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6sn\" (UniqueName: \"kubernetes.io/projected/af93c2fc-1286-4c32-b8ec-b582a58114b8-kube-api-access-5s6sn\") pod \"redhat-operators-khlvb\" (UID: \"af93c2fc-1286-4c32-b8ec-b582a58114b8\") " pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.016107 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.285117 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.286920 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.290457 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.298199 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.390793 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k558\" (UniqueName: \"kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.390857 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.390959 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.428252 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khlvb"] Oct 02 18:22:20 crc kubenswrapper[4909]: W1002 18:22:20.439474 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf93c2fc_1286_4c32_b8ec_b582a58114b8.slice/crio-b6bd9c5c542325a9e5d8de3a3dce84adfb51adbcb24910614079cc69d385b486 WatchSource:0}: Error finding container b6bd9c5c542325a9e5d8de3a3dce84adfb51adbcb24910614079cc69d385b486: Status 404 returned error can't find the container with id b6bd9c5c542325a9e5d8de3a3dce84adfb51adbcb24910614079cc69d385b486 Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.492570 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k558\" (UniqueName: \"kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.492635 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.492679 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.493070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 
18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.493620 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.514297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k558\" (UniqueName: \"kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558\") pod \"community-operators-nz5sb\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.517687 4909 generic.go:334] "Generic (PLEG): container finished" podID="7453f0a3-5410-4768-ac27-88ac0ad93046" containerID="d860c676d3e239332c372f8c55ca0ba6445aef2838836ba0ee02acf0aee941ea" exitCode=0 Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.517776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wflc9" event={"ID":"7453f0a3-5410-4768-ac27-88ac0ad93046","Type":"ContainerDied","Data":"d860c676d3e239332c372f8c55ca0ba6445aef2838836ba0ee02acf0aee941ea"} Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.518901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khlvb" event={"ID":"af93c2fc-1286-4c32-b8ec-b582a58114b8","Type":"ContainerStarted","Data":"b6bd9c5c542325a9e5d8de3a3dce84adfb51adbcb24910614079cc69d385b486"} Oct 02 18:22:20 crc kubenswrapper[4909]: I1002 18:22:20.624390 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.010441 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 18:22:21 crc kubenswrapper[4909]: W1002 18:22:21.018965 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27111323_170a_4dfd_9620_8062abd30b0f.slice/crio-079d3f39699f2723310abdcb043fd36e42cd7797568cac3c8aba229b18b532e4 WatchSource:0}: Error finding container 079d3f39699f2723310abdcb043fd36e42cd7797568cac3c8aba229b18b532e4: Status 404 returned error can't find the container with id 079d3f39699f2723310abdcb043fd36e42cd7797568cac3c8aba229b18b532e4 Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.526763 4909 generic.go:334] "Generic (PLEG): container finished" podID="27111323-170a-4dfd-9620-8062abd30b0f" containerID="c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0" exitCode=0 Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.526874 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerDied","Data":"c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0"} Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.527367 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerStarted","Data":"079d3f39699f2723310abdcb043fd36e42cd7797568cac3c8aba229b18b532e4"} Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.548396 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khlvb" 
event={"ID":"af93c2fc-1286-4c32-b8ec-b582a58114b8","Type":"ContainerDied","Data":"8516f98d8199418ca61030ea05f234991db5702c8801e548ae8e4dcc760571e2"} Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.552351 4909 generic.go:334] "Generic (PLEG): container finished" podID="af93c2fc-1286-4c32-b8ec-b582a58114b8" containerID="8516f98d8199418ca61030ea05f234991db5702c8801e548ae8e4dcc760571e2" exitCode=0 Oct 02 18:22:21 crc kubenswrapper[4909]: I1002 18:22:21.566179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc2w" event={"ID":"63f1acf8-a7a9-4a20-a6b7-fec8d828619d","Type":"ContainerStarted","Data":"9f2bbc6f5237737bd7295a5f7cf865f48636a84fbe7b8f7dc069e7cb8aca29d2"} Oct 02 18:22:22 crc kubenswrapper[4909]: I1002 18:22:22.574835 4909 generic.go:334] "Generic (PLEG): container finished" podID="63f1acf8-a7a9-4a20-a6b7-fec8d828619d" containerID="9f2bbc6f5237737bd7295a5f7cf865f48636a84fbe7b8f7dc069e7cb8aca29d2" exitCode=0 Oct 02 18:22:22 crc kubenswrapper[4909]: I1002 18:22:22.574901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc2w" event={"ID":"63f1acf8-a7a9-4a20-a6b7-fec8d828619d","Type":"ContainerDied","Data":"9f2bbc6f5237737bd7295a5f7cf865f48636a84fbe7b8f7dc069e7cb8aca29d2"} Oct 02 18:22:22 crc kubenswrapper[4909]: I1002 18:22:22.579360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wflc9" event={"ID":"7453f0a3-5410-4768-ac27-88ac0ad93046","Type":"ContainerStarted","Data":"74350dfd291cde5a336dbaae5331cfe7738e8172166bd2ada0e1d17cff1f1c3f"} Oct 02 18:22:22 crc kubenswrapper[4909]: I1002 18:22:22.623598 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wflc9" podStartSLOduration=2.626218385 podStartE2EDuration="5.623570557s" podCreationTimestamp="2025-10-02 18:22:17 +0000 UTC" firstStartedPulling="2025-10-02 18:22:18.502489792 +0000 UTC 
m=+259.689985651" lastFinishedPulling="2025-10-02 18:22:21.499841964 +0000 UTC m=+262.687337823" observedRunningTime="2025-10-02 18:22:22.622116591 +0000 UTC m=+263.809612460" watchObservedRunningTime="2025-10-02 18:22:22.623570557 +0000 UTC m=+263.811066456" Oct 02 18:22:23 crc kubenswrapper[4909]: I1002 18:22:23.587961 4909 generic.go:334] "Generic (PLEG): container finished" podID="27111323-170a-4dfd-9620-8062abd30b0f" containerID="0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c" exitCode=0 Oct 02 18:22:23 crc kubenswrapper[4909]: I1002 18:22:23.588530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerDied","Data":"0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c"} Oct 02 18:22:23 crc kubenswrapper[4909]: I1002 18:22:23.592521 4909 generic.go:334] "Generic (PLEG): container finished" podID="af93c2fc-1286-4c32-b8ec-b582a58114b8" containerID="abf73b2fc846e3d8c650e9e571aaabb664214324c7e0f2a93548b44078fa4a81" exitCode=0 Oct 02 18:22:23 crc kubenswrapper[4909]: I1002 18:22:23.592633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khlvb" event={"ID":"af93c2fc-1286-4c32-b8ec-b582a58114b8","Type":"ContainerDied","Data":"abf73b2fc846e3d8c650e9e571aaabb664214324c7e0f2a93548b44078fa4a81"} Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.617857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerStarted","Data":"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877"} Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.621923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khlvb" 
event={"ID":"af93c2fc-1286-4c32-b8ec-b582a58114b8","Type":"ContainerStarted","Data":"92b4b3333d46f5eff88f34c8386efc0b9fde5b8a8fa0335e9b7135f690186b61"} Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.630302 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdc2w" event={"ID":"63f1acf8-a7a9-4a20-a6b7-fec8d828619d","Type":"ContainerStarted","Data":"ee50f91bdfc394b6faf479204e922553ccd0210c8eb3564fae71a5dd87327529"} Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.640759 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nz5sb" podStartSLOduration=2.9082006099999997 podStartE2EDuration="5.640739541s" podCreationTimestamp="2025-10-02 18:22:20 +0000 UTC" firstStartedPulling="2025-10-02 18:22:21.52905276 +0000 UTC m=+262.716548619" lastFinishedPulling="2025-10-02 18:22:24.261591691 +0000 UTC m=+265.449087550" observedRunningTime="2025-10-02 18:22:25.638221642 +0000 UTC m=+266.825717521" watchObservedRunningTime="2025-10-02 18:22:25.640739541 +0000 UTC m=+266.828235400" Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.666429 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khlvb" podStartSLOduration=4.068183317 podStartE2EDuration="6.666409126s" podCreationTimestamp="2025-10-02 18:22:19 +0000 UTC" firstStartedPulling="2025-10-02 18:22:21.556536081 +0000 UTC m=+262.744031950" lastFinishedPulling="2025-10-02 18:22:24.15476186 +0000 UTC m=+265.342257759" observedRunningTime="2025-10-02 18:22:25.661847224 +0000 UTC m=+266.849343083" watchObservedRunningTime="2025-10-02 18:22:25.666409126 +0000 UTC m=+266.853904975" Oct 02 18:22:25 crc kubenswrapper[4909]: I1002 18:22:25.688739 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdc2w" podStartSLOduration=4.864783967 podStartE2EDuration="8.688722248s" 
podCreationTimestamp="2025-10-02 18:22:17 +0000 UTC" firstStartedPulling="2025-10-02 18:22:19.590188451 +0000 UTC m=+260.777684350" lastFinishedPulling="2025-10-02 18:22:23.414126772 +0000 UTC m=+264.601622631" observedRunningTime="2025-10-02 18:22:25.685820108 +0000 UTC m=+266.873315967" watchObservedRunningTime="2025-10-02 18:22:25.688722248 +0000 UTC m=+266.876218107" Oct 02 18:22:27 crc kubenswrapper[4909]: I1002 18:22:27.614845 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:27 crc kubenswrapper[4909]: I1002 18:22:27.615284 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:27 crc kubenswrapper[4909]: I1002 18:22:27.657582 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:27 crc kubenswrapper[4909]: I1002 18:22:27.702707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wflc9" Oct 02 18:22:28 crc kubenswrapper[4909]: I1002 18:22:28.201921 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:28 crc kubenswrapper[4909]: I1002 18:22:28.201973 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:28 crc kubenswrapper[4909]: I1002 18:22:28.243133 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.016422 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.016777 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.065469 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.625040 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.625143 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.688799 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.696011 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khlvb" Oct 02 18:22:30 crc kubenswrapper[4909]: I1002 18:22:30.763454 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 18:22:38 crc kubenswrapper[4909]: I1002 18:22:38.249406 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdc2w" Oct 02 18:23:53 crc kubenswrapper[4909]: I1002 18:23:53.054569 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:23:53 crc kubenswrapper[4909]: I1002 18:23:53.055260 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:24:23 crc kubenswrapper[4909]: I1002 18:24:23.054651 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:24:23 crc kubenswrapper[4909]: I1002 18:24:23.055616 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.054944 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.057235 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.057318 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.058372 4909 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.058486 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139" gracePeriod=600 Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.617779 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139" exitCode=0 Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.617878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139"} Oct 02 18:24:53 crc kubenswrapper[4909]: I1002 18:24:53.618527 4909 scope.go:117] "RemoveContainer" containerID="d7a3c693ae8c780d5bd4f933826e70805f438995f7f1029e020030bcca5675d5" Oct 02 18:24:54 crc kubenswrapper[4909]: I1002 18:24:54.631795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858"} Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.875860 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x8k2r"] Oct 02 
18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.877509 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.904807 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x8k2r"] Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-tls\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954219 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg28z\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-kube-api-access-zg28z\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-bound-sa-token\") pod 
\"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954288 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-certificates\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954310 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954357 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.954445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-trusted-ca\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:17 crc kubenswrapper[4909]: I1002 18:25:17.985161 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055699 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-trusted-ca\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-tls\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055819 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg28z\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-kube-api-access-zg28z\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055852 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-bound-sa-token\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 
18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055904 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-certificates\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.055942 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.056496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.057521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-certificates\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: 
\"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.058881 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-trusted-ca\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.064108 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-registry-tls\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.064735 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.074897 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg28z\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-kube-api-access-zg28z\") pod \"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.075834 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86e48a11-99e0-4a10-a6fe-6bf3a359c2c3-bound-sa-token\") pod 
\"image-registry-66df7c8f76-x8k2r\" (UID: \"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3\") " pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.204084 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.463755 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x8k2r"] Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.802881 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" event={"ID":"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3","Type":"ContainerStarted","Data":"61747f79a89dc264f13c047c22b0526d173b17053fd3e741d5e67572bfd72c4c"} Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.803516 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.803541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" event={"ID":"86e48a11-99e0-4a10-a6fe-6bf3a359c2c3","Type":"ContainerStarted","Data":"10c7c06935fd823b5932e165ea1e71c41cfcf3c5f807c41bd84cf4da9f3b1864"} Oct 02 18:25:18 crc kubenswrapper[4909]: I1002 18:25:18.826954 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" podStartSLOduration=1.82693239 podStartE2EDuration="1.82693239s" podCreationTimestamp="2025-10-02 18:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:25:18.82469366 +0000 UTC m=+440.012189599" watchObservedRunningTime="2025-10-02 18:25:18.82693239 +0000 UTC m=+440.014428249" Oct 02 18:25:38 crc 
kubenswrapper[4909]: I1002 18:25:38.221364 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-x8k2r" Oct 02 18:25:38 crc kubenswrapper[4909]: I1002 18:25:38.306668 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"] Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.360821 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" podUID="0c8d1462-46e9-4190-82f2-224aa63e60d8" containerName="registry" containerID="cri-o://57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491" gracePeriod=30 Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.710935 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hks\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823537 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: 
\"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823571 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823598 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.823885 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0c8d1462-46e9-4190-82f2-224aa63e60d8\" (UID: \"0c8d1462-46e9-4190-82f2-224aa63e60d8\") " Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.825121 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.825112 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.831886 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.832402 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.832498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.840203 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks" (OuterVolumeSpecName: "kube-api-access-85hks") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "kube-api-access-85hks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.840371 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.842326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0c8d1462-46e9-4190-82f2-224aa63e60d8" (UID: "0c8d1462-46e9-4190-82f2-224aa63e60d8"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925566 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c8d1462-46e9-4190-82f2-224aa63e60d8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925602 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925611 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hks\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-kube-api-access-85hks\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925623 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925634 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925643 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c8d1462-46e9-4190-82f2-224aa63e60d8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:03 crc kubenswrapper[4909]: I1002 18:26:03.925653 4909 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c8d1462-46e9-4190-82f2-224aa63e60d8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 18:26:04 crc 
kubenswrapper[4909]: I1002 18:26:04.101771 4909 generic.go:334] "Generic (PLEG): container finished" podID="0c8d1462-46e9-4190-82f2-224aa63e60d8" containerID="57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491" exitCode=0 Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.101839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" event={"ID":"0c8d1462-46e9-4190-82f2-224aa63e60d8","Type":"ContainerDied","Data":"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491"} Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.101873 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.101906 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kpmgk" event={"ID":"0c8d1462-46e9-4190-82f2-224aa63e60d8","Type":"ContainerDied","Data":"84504889a05a34f987d2d73bd46df225d93acdf61386c3df42f4b226ac055f40"} Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.101933 4909 scope.go:117] "RemoveContainer" containerID="57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491" Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.134254 4909 scope.go:117] "RemoveContainer" containerID="57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491" Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.135614 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"] Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.140889 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kpmgk"] Oct 02 18:26:04 crc kubenswrapper[4909]: E1002 18:26:04.141105 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491\": container with ID starting with 57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491 not found: ID does not exist" containerID="57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491" Oct 02 18:26:04 crc kubenswrapper[4909]: I1002 18:26:04.141145 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491"} err="failed to get container status \"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491\": rpc error: code = NotFound desc = could not find container \"57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491\": container with ID starting with 57373bc3f48f17bb8aa8e5659a5ae3edc4d319d33f2e2f753bec93550210c491 not found: ID does not exist" Oct 02 18:26:05 crc kubenswrapper[4909]: I1002 18:26:05.618573 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8d1462-46e9-4190-82f2-224aa63e60d8" path="/var/lib/kubelet/pods/0c8d1462-46e9-4190-82f2-224aa63e60d8/volumes" Oct 02 18:26:53 crc kubenswrapper[4909]: I1002 18:26:53.054406 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:26:53 crc kubenswrapper[4909]: I1002 18:26:53.055097 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:26:59 crc kubenswrapper[4909]: I1002 18:26:59.775289 4909 scope.go:117] "RemoveContainer" 
containerID="bcf255961ce789802b8cc00bbdf9e96f050cd885256beb223974e2f9a1e73058" Oct 02 18:26:59 crc kubenswrapper[4909]: I1002 18:26:59.805907 4909 scope.go:117] "RemoveContainer" containerID="b9233162b7819c9d16f557083e7a4e390d2134b943e9ac9682e0ded23421ea0f" Oct 02 18:27:23 crc kubenswrapper[4909]: I1002 18:27:23.054911 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:27:23 crc kubenswrapper[4909]: I1002 18:27:23.055638 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.054917 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.055628 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.055702 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:27:53 crc 
kubenswrapper[4909]: I1002 18:27:53.056573 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.056659 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858" gracePeriod=600 Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.860816 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858" exitCode=0 Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.860888 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858"} Oct 02 18:27:53 crc kubenswrapper[4909]: I1002 18:27:53.860941 4909 scope.go:117] "RemoveContainer" containerID="fa95c4141d0fec98bffe78b803476084fa5d4a01c2fa85e67d5e665f27f9a139" Oct 02 18:27:54 crc kubenswrapper[4909]: I1002 18:27:54.870349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722"} Oct 02 18:27:59 crc kubenswrapper[4909]: I1002 18:27:59.865362 4909 scope.go:117] 
"RemoveContainer" containerID="5b03157131b0ab0da0e2d19ec6d8d9fa7a0ef4e7e06408bdb63009a544378a4e" Oct 02 18:27:59 crc kubenswrapper[4909]: I1002 18:27:59.890747 4909 scope.go:117] "RemoveContainer" containerID="d669af689ab6c4f26264a8aeeefdbec0272fb0d6191838b976fc5bf5e000d9c5" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.321107 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"] Oct 02 18:28:20 crc kubenswrapper[4909]: E1002 18:28:20.324448 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8d1462-46e9-4190-82f2-224aa63e60d8" containerName="registry" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.324621 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8d1462-46e9-4190-82f2-224aa63e60d8" containerName="registry" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.324960 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8d1462-46e9-4190-82f2-224aa63e60d8" containerName="registry" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.329896 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.332994 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"] Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.334175 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.457535 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.457601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.457685 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vg8\" (UniqueName: \"kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" Oct 02 18:28:20 crc kubenswrapper[4909]: 
I1002 18:28:20.559284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.559876 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.560363 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8vg8\" (UniqueName: \"kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.560424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.560840 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.592082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8vg8\" (UniqueName: \"kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.647961 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:20 crc kubenswrapper[4909]: I1002 18:28:20.904438 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"]
Oct 02 18:28:21 crc kubenswrapper[4909]: I1002 18:28:21.067533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" event={"ID":"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8","Type":"ContainerStarted","Data":"32522e4c9dd961c116ca6c58121c0d7c218cc21c0d02a6e484de6a37b5246c58"}
Oct 02 18:28:22 crc kubenswrapper[4909]: I1002 18:28:22.075482 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerID="ee5d110206db5e26fca5436206d625dd844f73abe504591f66fdd4dfb925dae6" exitCode=0
Oct 02 18:28:22 crc kubenswrapper[4909]: I1002 18:28:22.075595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" event={"ID":"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8","Type":"ContainerDied","Data":"ee5d110206db5e26fca5436206d625dd844f73abe504591f66fdd4dfb925dae6"}
Oct 02 18:28:22 crc kubenswrapper[4909]: I1002 18:28:22.077557 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 18:28:25 crc kubenswrapper[4909]: I1002 18:28:25.093330 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerID="de5823363f59f98deaf1e5f256a834a6bdd8caf07636f17ab3ec5910a659a2a1" exitCode=0
Oct 02 18:28:25 crc kubenswrapper[4909]: I1002 18:28:25.093400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" event={"ID":"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8","Type":"ContainerDied","Data":"de5823363f59f98deaf1e5f256a834a6bdd8caf07636f17ab3ec5910a659a2a1"}
Oct 02 18:28:26 crc kubenswrapper[4909]: I1002 18:28:26.106634 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerID="e3e27cbe5cd492312e5dfcdc6b2ee5542c0e7de69e37c35c78476175be6c57a6" exitCode=0
Oct 02 18:28:26 crc kubenswrapper[4909]: I1002 18:28:26.106723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" event={"ID":"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8","Type":"ContainerDied","Data":"e3e27cbe5cd492312e5dfcdc6b2ee5542c0e7de69e37c35c78476175be6c57a6"}
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.436504 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.559455 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle\") pod \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") "
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.559510 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util\") pod \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") "
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.559581 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8vg8\" (UniqueName: \"kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8\") pod \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\" (UID: \"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8\") "
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.562920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle" (OuterVolumeSpecName: "bundle") pod "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" (UID: "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.569175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8" (OuterVolumeSpecName: "kube-api-access-v8vg8") pod "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" (UID: "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8"). InnerVolumeSpecName "kube-api-access-v8vg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.578081 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util" (OuterVolumeSpecName: "util") pod "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" (UID: "9dc24505-fbd1-404b-b8c8-9b14f01fc1a8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.661425 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8vg8\" (UniqueName: \"kubernetes.io/projected/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-kube-api-access-v8vg8\") on node \"crc\" DevicePath \"\""
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.661484 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:28:27 crc kubenswrapper[4909]: I1002 18:28:27.661502 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9dc24505-fbd1-404b-b8c8-9b14f01fc1a8-util\") on node \"crc\" DevicePath \"\""
Oct 02 18:28:28 crc kubenswrapper[4909]: I1002 18:28:28.122123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg" event={"ID":"9dc24505-fbd1-404b-b8c8-9b14f01fc1a8","Type":"ContainerDied","Data":"32522e4c9dd961c116ca6c58121c0d7c218cc21c0d02a6e484de6a37b5246c58"}
Oct 02 18:28:28 crc kubenswrapper[4909]: I1002 18:28:28.122168 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32522e4c9dd961c116ca6c58121c0d7c218cc21c0d02a6e484de6a37b5246c58"
Oct 02 18:28:28 crc kubenswrapper[4909]: I1002 18:28:28.122235 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg"
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.787863 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4scf8"]
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.789891 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-controller" containerID="cri-o://10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.790011 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="northd" containerID="cri-o://3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.790056 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.790143 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-acl-logging" containerID="cri-o://e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.790000 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-node" containerID="cri-o://b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.789952 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="nbdb" containerID="cri-o://dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.789985 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="sbdb" containerID="cri-o://da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083" gracePeriod=30
Oct 02 18:28:31 crc kubenswrapper[4909]: I1002 18:28:31.840724 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller" containerID="cri-o://902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee" gracePeriod=30
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.145153 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovnkube-controller/3.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.147704 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-acl-logging/0.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148140 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-controller/0.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148889 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee" exitCode=0
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148938 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083" exitCode=0
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148951 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543" exitCode=0
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148961 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982" exitCode=0
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148970 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97" exitCode=0
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148967 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149014 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.148980 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd" exitCode=143
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149089 4909 scope.go:117] "RemoveContainer" containerID="bfeb1dc05b53dc20b677bb3df4d87f7d6b92e90f3e3e7436474c80e3f0b19e87"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149090 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149237 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.149061 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9" exitCode=143
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.152853 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/2.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.153295 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/1.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.153334 4909 generic.go:334] "Generic (PLEG): container finished" podID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e" containerID="44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95" exitCode=2
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.153364 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerDied","Data":"44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95"}
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.153899 4909 scope.go:117] "RemoveContainer" containerID="44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.154165 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7gpnt_openshift-multus(c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e)\"" pod="openshift-multus/multus-7gpnt" podUID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.174059 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-acl-logging/0.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.174534 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-controller/0.log"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.174915 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.190633 4909 scope.go:117] "RemoveContainer" containerID="5a096e99edff2dd141cd410e125d2d0ecc84c0b2437d419044df0cfa5b2cface"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274459 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f2bbt"]
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274644 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274657 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274664 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-node"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274670 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-node"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="nbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274687 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="nbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274696 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274703 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274710 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274716 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274723 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274729 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274737 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kubecfg-setup"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274743 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kubecfg-setup"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274750 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274756 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274765 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274770 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="sbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274785 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="sbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274795 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-acl-logging"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274800 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-acl-logging"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274809 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="northd"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274814 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="northd"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274826 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="util"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274831 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="util"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274839 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="pull"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274845 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="pull"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.274855 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="extract"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274863 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="extract"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274955 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-ovn-metrics"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274967 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc24505-fbd1-404b-b8c8-9b14f01fc1a8" containerName="extract"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274978 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-acl-logging"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274989 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="northd"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.274997 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275004 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="kube-rbac-proxy-node"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275010 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275017 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovn-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275042 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275052 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="sbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275058 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="nbdb"
Oct 02 18:28:32 crc kubenswrapper[4909]: E1002 18:28:32.275142 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275148 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275231 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.275239 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerName="ovnkube-controller"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.276712 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt"
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327016 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327093 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327124 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327222 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327237 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327259 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327344 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash" (OuterVolumeSpecName: "host-slash") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327384 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp72d\" (UniqueName: \"kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327432 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327486 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327489 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327501 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327535 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket" (OuterVolumeSpecName: "log-socket") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327557 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327571 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327653 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") "
Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log" (OuterVolumeSpecName: "node-log") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "node-log".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327700 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327733 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327716 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327721 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327755 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd\") pod \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\" (UID: \"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e\") " Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.327869 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328241 4909 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328256 4909 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328265 4909 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328276 4909 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328287 4909 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328298 4909 
reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328307 4909 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328316 4909 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328324 4909 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328333 4909 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328341 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328351 4909 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328360 4909 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328370 4909 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.328378 4909 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.329218 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.332261 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.337932 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d" (OuterVolumeSpecName: "kube-api-access-qp72d") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "kube-api-access-qp72d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.340604 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.354347 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" (UID: "4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430081 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovn-node-metrics-cert\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-node-log\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430183 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-log-socket\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430336 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlqcq\" (UniqueName: \"kubernetes.io/projected/4825d3dc-ce2c-4222-88cb-0b81458b9878-kube-api-access-mlqcq\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430468 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-netns\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-config\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430530 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-netd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-etc-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-script-lib\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-bin\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-slash\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430921 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.430984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-systemd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-var-lib-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-ovn\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc 
kubenswrapper[4909]: I1002 18:28:32.431095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-kubelet\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-env-overrides\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-systemd-units\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431412 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431435 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431449 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp72d\" (UniqueName: \"kubernetes.io/projected/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-kube-api-access-qp72d\") on node \"crc\" 
DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431461 4909 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.431471 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-systemd-units\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovn-node-metrics-cert\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532877 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-node-log\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-log-socket\") pod \"ovnkube-node-f2bbt\" (UID: 
\"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-log-socket\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532956 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-systemd-units\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.532980 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533060 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlqcq\" (UniqueName: \"kubernetes.io/projected/4825d3dc-ce2c-4222-88cb-0b81458b9878-kube-api-access-mlqcq\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533092 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-netns\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533088 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-node-log\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-config\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533213 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-netd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-etc-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 
18:28:32.533385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-script-lib\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533418 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-bin\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-slash\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533550 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-systemd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-systemd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533488 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-netd\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533634 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-var-lib-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533640 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-etc-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533662 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-ovn\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533686 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-kubelet\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-slash\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-config\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533721 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-env-overrides\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-cni-bin\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-run-ovn\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-run-netns\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-host-kubelet\") pod \"ovnkube-node-f2bbt\" (UID: 
\"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4825d3dc-ce2c-4222-88cb-0b81458b9878-var-lib-openvswitch\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.533947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovnkube-script-lib\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.534294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4825d3dc-ce2c-4222-88cb-0b81458b9878-env-overrides\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.537530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4825d3dc-ce2c-4222-88cb-0b81458b9878-ovn-node-metrics-cert\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.555558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlqcq\" (UniqueName: \"kubernetes.io/projected/4825d3dc-ce2c-4222-88cb-0b81458b9878-kube-api-access-mlqcq\") pod \"ovnkube-node-f2bbt\" (UID: \"4825d3dc-ce2c-4222-88cb-0b81458b9878\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: I1002 18:28:32.589747 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:32 crc kubenswrapper[4909]: W1002 18:28:32.610255 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4825d3dc_ce2c_4222_88cb_0b81458b9878.slice/crio-40401b44fed2dfe2dc6655e027218a2bd1046f46337dce1129a4c616e9c9143c WatchSource:0}: Error finding container 40401b44fed2dfe2dc6655e027218a2bd1046f46337dce1129a4c616e9c9143c: Status 404 returned error can't find the container with id 40401b44fed2dfe2dc6655e027218a2bd1046f46337dce1129a4c616e9c9143c Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.160719 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/2.log" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.162672 4909 generic.go:334] "Generic (PLEG): container finished" podID="4825d3dc-ce2c-4222-88cb-0b81458b9878" containerID="91b29c498a9ae846a86eed0a56b2bf0ee657e55a43418be4fb2e42b1507563ad" exitCode=0 Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.162762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerDied","Data":"91b29c498a9ae846a86eed0a56b2bf0ee657e55a43418be4fb2e42b1507563ad"} Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.163205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"40401b44fed2dfe2dc6655e027218a2bd1046f46337dce1129a4c616e9c9143c"} Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.172373 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-acl-logging/0.log" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.173287 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4scf8_4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/ovn-controller/0.log" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.174047 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" containerID="dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05" exitCode=0 Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.174104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05"} Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.174128 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" event={"ID":"4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e","Type":"ContainerDied","Data":"9e3044b5835521e010f1192f76b1a7d6b581f5f57b066d9292ba40c756595779"} Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.174147 4909 scope.go:117] "RemoveContainer" containerID="902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.174313 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4scf8" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.202736 4909 scope.go:117] "RemoveContainer" containerID="da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.236596 4909 scope.go:117] "RemoveContainer" containerID="dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.243081 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4scf8"] Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.252266 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4scf8"] Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.252653 4909 scope.go:117] "RemoveContainer" containerID="3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.287213 4909 scope.go:117] "RemoveContainer" containerID="f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.301461 4909 scope.go:117] "RemoveContainer" containerID="b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.319308 4909 scope.go:117] "RemoveContainer" containerID="e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.339913 4909 scope.go:117] "RemoveContainer" containerID="10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.358862 4909 scope.go:117] "RemoveContainer" containerID="3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.375826 4909 scope.go:117] "RemoveContainer" 
containerID="902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.376404 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee\": container with ID starting with 902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee not found: ID does not exist" containerID="902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.376460 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee"} err="failed to get container status \"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee\": rpc error: code = NotFound desc = could not find container \"902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee\": container with ID starting with 902588d55f3184cf81e1b3c939dadb1500ae3af681b9e26bb0137b64e7bd9cee not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.376499 4909 scope.go:117] "RemoveContainer" containerID="da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.376876 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\": container with ID starting with da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083 not found: ID does not exist" containerID="da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.376917 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083"} err="failed to get container status \"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\": rpc error: code = NotFound desc = could not find container \"da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083\": container with ID starting with da9b21f114e8df828e1ef0139e18b3046ab0e4751fb0fcbd7048cb51f91ce083 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.376948 4909 scope.go:117] "RemoveContainer" containerID="dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.377359 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\": container with ID starting with dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05 not found: ID does not exist" containerID="dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.377381 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05"} err="failed to get container status \"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\": rpc error: code = NotFound desc = could not find container \"dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05\": container with ID starting with dcc7ecd029f3bd551242a35823f39e645a4d41bb8cf60b3440b443592e44ff05 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.377395 4909 scope.go:117] "RemoveContainer" containerID="3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.377686 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\": container with ID starting with 3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543 not found: ID does not exist" containerID="3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.377720 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543"} err="failed to get container status \"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\": rpc error: code = NotFound desc = could not find container \"3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543\": container with ID starting with 3bc6679a86d6c84cf82f2300e9dacce7b29a89ab079d012427cbed48d56c3543 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.377741 4909 scope.go:117] "RemoveContainer" containerID="f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.378019 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\": container with ID starting with f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982 not found: ID does not exist" containerID="f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378056 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982"} err="failed to get container status \"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\": rpc error: code = NotFound desc = could not find container 
\"f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982\": container with ID starting with f7de6c278a68bd9aa0bb2674144213bc2c8691f55f85c3fd6ef449a6c7148982 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378070 4909 scope.go:117] "RemoveContainer" containerID="b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.378390 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\": container with ID starting with b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97 not found: ID does not exist" containerID="b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378409 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97"} err="failed to get container status \"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\": rpc error: code = NotFound desc = could not find container \"b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97\": container with ID starting with b1aa949c12b5b366e66f70bb804c82e9adda2fd1ae7bce1690dc339a2a198e97 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378420 4909 scope.go:117] "RemoveContainer" containerID="e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.378734 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\": container with ID starting with e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd not found: ID does not exist" 
containerID="e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378753 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd"} err="failed to get container status \"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\": rpc error: code = NotFound desc = could not find container \"e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd\": container with ID starting with e9ae99e82849e496cf3cd02eaeaff83a3a2b7df7968c6cee372558635f6ac5fd not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.378766 4909 scope.go:117] "RemoveContainer" containerID="10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.379081 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\": container with ID starting with 10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9 not found: ID does not exist" containerID="10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.379100 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9"} err="failed to get container status \"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\": rpc error: code = NotFound desc = could not find container \"10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9\": container with ID starting with 10cd56356f189291cb2f810bdd2824d0692c11097081c3a04acf9f59ce4c60d9 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.379111 4909 scope.go:117] 
"RemoveContainer" containerID="3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081" Oct 02 18:28:33 crc kubenswrapper[4909]: E1002 18:28:33.379419 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\": container with ID starting with 3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081 not found: ID does not exist" containerID="3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.379441 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081"} err="failed to get container status \"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\": rpc error: code = NotFound desc = could not find container \"3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081\": container with ID starting with 3a558e9dcedf6ab06478d80866201d3205267b53367c82c99212ef6e7bd7b081 not found: ID does not exist" Oct 02 18:28:33 crc kubenswrapper[4909]: I1002 18:28:33.614620 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e" path="/var/lib/kubelet/pods/4e14d2e1-4c48-4e50-bfe3-fd32c5eea95e/volumes" Oct 02 18:28:34 crc kubenswrapper[4909]: I1002 18:28:34.181695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"bbb536b67b872a256716ddf10d152e043ecc361f5453c3f41b27d588ee5723d3"} Oct 02 18:28:34 crc kubenswrapper[4909]: I1002 18:28:34.182221 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" 
event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"429be6cce413fdcfc2476708bd5c8a422028dccc7ca0633515e59f59d34396e1"} Oct 02 18:28:34 crc kubenswrapper[4909]: I1002 18:28:34.182236 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"0af5db4467f5d90e84064e6a31ff0f32d0d43c82f29b1603ec8957086c40d76a"} Oct 02 18:28:34 crc kubenswrapper[4909]: I1002 18:28:34.182249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"dbf72db2d20415b7332e84b0a15a0380c3c84d66926bf8e9725d89cfb90a4f2e"} Oct 02 18:28:35 crc kubenswrapper[4909]: I1002 18:28:35.192135 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"d825c0466e6e87b21a1acfbb51794ace73ad9dcb634ae7aa9601b545f76e7d22"} Oct 02 18:28:35 crc kubenswrapper[4909]: I1002 18:28:35.192180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"29ce782f2e052a99c1ec319aaf2e7b762cc2170037ea863505874f867136faf3"} Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.207326 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"e8c323eb576142f016fbe31200be966ae0ddb8aa56d51721fea6b999409592ba"} Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.603829 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm"] Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.604720 
4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.606579 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.606664 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-f9jfk" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.607622 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.720990 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk"] Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.722478 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.724493 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.724991 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-lplpf" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.745639 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb"] Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.746703 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.797477 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfjgk\" (UniqueName: \"kubernetes.io/projected/355b5784-3dc4-4a65-93d8-4bdddd018358-kube-api-access-kfjgk\") pod \"obo-prometheus-operator-7c8cf85677-zhfgm\" (UID: \"355b5784-3dc4-4a65-93d8-4bdddd018358\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.830391 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mwrxx"] Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.831241 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.833953 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.838744 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-7f7rd" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.898379 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.898443 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.898480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: \"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.898576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: \"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.898648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfjgk\" (UniqueName: \"kubernetes.io/projected/355b5784-3dc4-4a65-93d8-4bdddd018358-kube-api-access-kfjgk\") pod \"obo-prometheus-operator-7c8cf85677-zhfgm\" (UID: \"355b5784-3dc4-4a65-93d8-4bdddd018358\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.919147 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfjgk\" (UniqueName: \"kubernetes.io/projected/355b5784-3dc4-4a65-93d8-4bdddd018358-kube-api-access-kfjgk\") pod \"obo-prometheus-operator-7c8cf85677-zhfgm\" (UID: 
\"355b5784-3dc4-4a65-93d8-4bdddd018358\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.924739 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: E1002 18:28:37.954714 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(fe74f9cee60fbac115691463c2187b01d4daf356155d6d9538c96496b14b3df5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:37 crc kubenswrapper[4909]: E1002 18:28:37.954821 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(fe74f9cee60fbac115691463c2187b01d4daf356155d6d9538c96496b14b3df5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: E1002 18:28:37.954871 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(fe74f9cee60fbac115691463c2187b01d4daf356155d6d9538c96496b14b3df5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:37 crc kubenswrapper[4909]: E1002 18:28:37.954947 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(fe74f9cee60fbac115691463c2187b01d4daf356155d6d9538c96496b14b3df5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" podUID="355b5784-3dc4-4a65-93d8-4bdddd018358" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.999710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.999765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hs8\" (UniqueName: \"kubernetes.io/projected/e4b958c5-5914-4978-884c-e9f42430f52f-kube-api-access-l8hs8\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.999790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.999820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: \"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:37 crc kubenswrapper[4909]: I1002 18:28:37.999840 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: \"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.000405 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4b958c5-5914-4978-884c-e9f42430f52f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.004606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: 
\"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.004619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/109ab413-2eeb-42e1-b39f-feddfe589bcc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk\" (UID: \"109ab413-2eeb-42e1-b39f-feddfe589bcc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.015278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.017090 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38edcfed-aa44-406d-9028-8395eb3ebb06-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb\" (UID: \"38edcfed-aa44-406d-9028-8395eb3ebb06\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.030001 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9btfm"] Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.030902 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.039929 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-pxnmq" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.044856 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.063247 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.068284 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(cc4eb7c4fc18885c4ff42e977d4e0fb855cc7dfeeccefe3120b11c5c9fc3a378): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.068358 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(cc4eb7c4fc18885c4ff42e977d4e0fb855cc7dfeeccefe3120b11c5c9fc3a378): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.068393 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(cc4eb7c4fc18885c4ff42e977d4e0fb855cc7dfeeccefe3120b11c5c9fc3a378): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.068454 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(cc4eb7c4fc18885c4ff42e977d4e0fb855cc7dfeeccefe3120b11c5c9fc3a378): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" podUID="109ab413-2eeb-42e1-b39f-feddfe589bcc" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.089105 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b7c5512373e2f1cd9a76d4714648408b924490e49a152f979483e9438f1f9000): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.089186 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b7c5512373e2f1cd9a76d4714648408b924490e49a152f979483e9438f1f9000): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.089208 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b7c5512373e2f1cd9a76d4714648408b924490e49a152f979483e9438f1f9000): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.089259 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b7c5512373e2f1cd9a76d4714648408b924490e49a152f979483e9438f1f9000): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" podUID="38edcfed-aa44-406d-9028-8395eb3ebb06" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.101669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4b958c5-5914-4978-884c-e9f42430f52f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.101769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hs8\" (UniqueName: \"kubernetes.io/projected/e4b958c5-5914-4978-884c-e9f42430f52f-kube-api-access-l8hs8\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.106173 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4b958c5-5914-4978-884c-e9f42430f52f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.118071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hs8\" (UniqueName: \"kubernetes.io/projected/e4b958c5-5914-4978-884c-e9f42430f52f-kube-api-access-l8hs8\") pod \"observability-operator-cc5f78dfc-mwrxx\" (UID: \"e4b958c5-5914-4978-884c-e9f42430f52f\") " pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.148523 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.171146 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(37b70234486c22af34967a6376493999f991f10bd047d2281051a617aa6540d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.171210 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(37b70234486c22af34967a6376493999f991f10bd047d2281051a617aa6540d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.171238 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(37b70234486c22af34967a6376493999f991f10bd047d2281051a617aa6540d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.171287 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(37b70234486c22af34967a6376493999f991f10bd047d2281051a617aa6540d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" podUID="e4b958c5-5914-4978-884c-e9f42430f52f" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.202664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/671892a9-6a46-445d-b470-6c16b55b8818-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.202788 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqfm\" (UniqueName: \"kubernetes.io/projected/671892a9-6a46-445d-b470-6c16b55b8818-kube-api-access-npqfm\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.304451 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqfm\" (UniqueName: \"kubernetes.io/projected/671892a9-6a46-445d-b470-6c16b55b8818-kube-api-access-npqfm\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.304656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/671892a9-6a46-445d-b470-6c16b55b8818-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.306546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/671892a9-6a46-445d-b470-6c16b55b8818-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.323246 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqfm\" (UniqueName: \"kubernetes.io/projected/671892a9-6a46-445d-b470-6c16b55b8818-kube-api-access-npqfm\") pod \"perses-operator-54bc95c9fb-9btfm\" (UID: \"671892a9-6a46-445d-b470-6c16b55b8818\") " pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: I1002 18:28:38.346692 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.370896 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(a5964a18aab6f821186a43a378e5990ca4773051a48799fa1e93d8d42d795c55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.370967 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(a5964a18aab6f821186a43a378e5990ca4773051a48799fa1e93d8d42d795c55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.370998 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(a5964a18aab6f821186a43a378e5990ca4773051a48799fa1e93d8d42d795c55): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:38 crc kubenswrapper[4909]: E1002 18:28:38.371083 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(a5964a18aab6f821186a43a378e5990ca4773051a48799fa1e93d8d42d795c55): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" podUID="671892a9-6a46-445d-b470-6c16b55b8818" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.257353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" event={"ID":"4825d3dc-ce2c-4222-88cb-0b81458b9878","Type":"ContainerStarted","Data":"80e8931ee11774c9470109151e4ba663c78b852309bf6f1d0375bd711ea19a92"} Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.257912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.257964 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.257977 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.298625 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" podStartSLOduration=7.298609136 podStartE2EDuration="7.298609136s" podCreationTimestamp="2025-10-02 18:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:28:39.294679299 +0000 UTC m=+640.482175158" watchObservedRunningTime="2025-10-02 18:28:39.298609136 +0000 UTC m=+640.486104995" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.331859 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:39 crc kubenswrapper[4909]: I1002 18:28:39.333345 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:28:40 crc kubenswrapper[4909]: 
I1002 18:28:40.002503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm"] Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.003210 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.003820 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.013152 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9btfm"] Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.013259 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.013693 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.022018 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk"] Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.022168 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.022735 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.034218 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(32ceadbca36fd9c1471fb631f711bdf93a8a00daf50806cb7930e02ca349984e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.034297 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(32ceadbca36fd9c1471fb631f711bdf93a8a00daf50806cb7930e02ca349984e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.034320 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(32ceadbca36fd9c1471fb631f711bdf93a8a00daf50806cb7930e02ca349984e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.034368 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(32ceadbca36fd9c1471fb631f711bdf93a8a00daf50806cb7930e02ca349984e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" podUID="355b5784-3dc4-4a65-93d8-4bdddd018358" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.036363 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb"] Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.036539 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.037184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.062610 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mwrxx"] Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.062727 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:40 crc kubenswrapper[4909]: I1002 18:28:40.063136 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.073211 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(e570be57a12f98939b554ac9b00a1c0bddfb578f1b1ed43a7839e28415686522): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.073278 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(e570be57a12f98939b554ac9b00a1c0bddfb578f1b1ed43a7839e28415686522): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.073301 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(e570be57a12f98939b554ac9b00a1c0bddfb578f1b1ed43a7839e28415686522): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.073348 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(e570be57a12f98939b554ac9b00a1c0bddfb578f1b1ed43a7839e28415686522): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" podUID="671892a9-6a46-445d-b470-6c16b55b8818" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.083303 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b5faac4f55bfe3de2d2457d34f293d224345e6844bbda449e3848db0b60837e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.083369 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b5faac4f55bfe3de2d2457d34f293d224345e6844bbda449e3848db0b60837e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.083394 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b5faac4f55bfe3de2d2457d34f293d224345e6844bbda449e3848db0b60837e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.083440 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(b5faac4f55bfe3de2d2457d34f293d224345e6844bbda449e3848db0b60837e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" podUID="38edcfed-aa44-406d-9028-8395eb3ebb06" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.091294 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(bbcfe7b2e54528690b4c43609e28b0691dbac541ec4ba4c263655d22744dfb8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.091341 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(bbcfe7b2e54528690b4c43609e28b0691dbac541ec4ba4c263655d22744dfb8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.091360 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(bbcfe7b2e54528690b4c43609e28b0691dbac541ec4ba4c263655d22744dfb8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.091403 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(bbcfe7b2e54528690b4c43609e28b0691dbac541ec4ba4c263655d22744dfb8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" podUID="109ab413-2eeb-42e1-b39f-feddfe589bcc" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.116519 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(234c0af2d30556fa9eab2c113c8cc90255b366bf754b4c155a3cfef80e213d84): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.116588 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(234c0af2d30556fa9eab2c113c8cc90255b366bf754b4c155a3cfef80e213d84): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.116612 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(234c0af2d30556fa9eab2c113c8cc90255b366bf754b4c155a3cfef80e213d84): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:40 crc kubenswrapper[4909]: E1002 18:28:40.116655 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(234c0af2d30556fa9eab2c113c8cc90255b366bf754b4c155a3cfef80e213d84): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" podUID="e4b958c5-5914-4978-884c-e9f42430f52f" Oct 02 18:28:45 crc kubenswrapper[4909]: I1002 18:28:45.608287 4909 scope.go:117] "RemoveContainer" containerID="44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95" Oct 02 18:28:45 crc kubenswrapper[4909]: E1002 18:28:45.609012 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7gpnt_openshift-multus(c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e)\"" pod="openshift-multus/multus-7gpnt" podUID="c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e" Oct 02 18:28:50 crc kubenswrapper[4909]: I1002 18:28:50.608126 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:50 crc kubenswrapper[4909]: I1002 18:28:50.609201 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:50 crc kubenswrapper[4909]: E1002 18:28:50.679330 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(ea2dcd3e1cdff0486b75a7f90e330afc61c01153b3f9b21485435133ca048142): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 18:28:50 crc kubenswrapper[4909]: E1002 18:28:50.679453 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(ea2dcd3e1cdff0486b75a7f90e330afc61c01153b3f9b21485435133ca048142): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:50 crc kubenswrapper[4909]: E1002 18:28:50.679600 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(ea2dcd3e1cdff0486b75a7f90e330afc61c01153b3f9b21485435133ca048142): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:28:50 crc kubenswrapper[4909]: E1002 18:28:50.679689 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-mwrxx_openshift-operators(e4b958c5-5914-4978-884c-e9f42430f52f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-mwrxx_openshift-operators_e4b958c5-5914-4978-884c-e9f42430f52f_0(ea2dcd3e1cdff0486b75a7f90e330afc61c01153b3f9b21485435133ca048142): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" podUID="e4b958c5-5914-4978-884c-e9f42430f52f" Oct 02 18:28:51 crc kubenswrapper[4909]: I1002 18:28:51.611670 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:51 crc kubenswrapper[4909]: I1002 18:28:51.612111 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:51 crc kubenswrapper[4909]: E1002 18:28:51.662410 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(12e81e3a83bd932210d1d5ed3e37f26f9f10e6f78f946cf28eec41b8221a4e16): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:51 crc kubenswrapper[4909]: E1002 18:28:51.662663 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(12e81e3a83bd932210d1d5ed3e37f26f9f10e6f78f946cf28eec41b8221a4e16): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:51 crc kubenswrapper[4909]: E1002 18:28:51.662709 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(12e81e3a83bd932210d1d5ed3e37f26f9f10e6f78f946cf28eec41b8221a4e16): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:28:51 crc kubenswrapper[4909]: E1002 18:28:51.662762 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators(355b5784-3dc4-4a65-93d8-4bdddd018358)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-zhfgm_openshift-operators_355b5784-3dc4-4a65-93d8-4bdddd018358_0(12e81e3a83bd932210d1d5ed3e37f26f9f10e6f78f946cf28eec41b8221a4e16): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" podUID="355b5784-3dc4-4a65-93d8-4bdddd018358" Oct 02 18:28:52 crc kubenswrapper[4909]: I1002 18:28:52.607814 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:52 crc kubenswrapper[4909]: I1002 18:28:52.608472 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:52 crc kubenswrapper[4909]: E1002 18:28:52.643128 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(587b1b02d0b5feacef5ec8dad2e244c83ca4bc60e4b64de24c177c2bafe7ac7b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 18:28:52 crc kubenswrapper[4909]: E1002 18:28:52.643221 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(587b1b02d0b5feacef5ec8dad2e244c83ca4bc60e4b64de24c177c2bafe7ac7b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:52 crc kubenswrapper[4909]: E1002 18:28:52.643246 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(587b1b02d0b5feacef5ec8dad2e244c83ca4bc60e4b64de24c177c2bafe7ac7b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:28:52 crc kubenswrapper[4909]: E1002 18:28:52.643321 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators(38edcfed-aa44-406d-9028-8395eb3ebb06)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_openshift-operators_38edcfed-aa44-406d-9028-8395eb3ebb06_0(587b1b02d0b5feacef5ec8dad2e244c83ca4bc60e4b64de24c177c2bafe7ac7b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" podUID="38edcfed-aa44-406d-9028-8395eb3ebb06" Oct 02 18:28:55 crc kubenswrapper[4909]: I1002 18:28:55.608292 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:55 crc kubenswrapper[4909]: I1002 18:28:55.608395 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:55 crc kubenswrapper[4909]: I1002 18:28:55.609247 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:55 crc kubenswrapper[4909]: I1002 18:28:55.609329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.648247 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(1eb92f47f470b6d63e1d7fe8bfc24a17b37323a168762c12f73299baf6e81e4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.648353 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(1eb92f47f470b6d63e1d7fe8bfc24a17b37323a168762c12f73299baf6e81e4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.648385 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(1eb92f47f470b6d63e1d7fe8bfc24a17b37323a168762c12f73299baf6e81e4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.648451 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-9btfm_openshift-operators(671892a9-6a46-445d-b470-6c16b55b8818)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-9btfm_openshift-operators_671892a9-6a46-445d-b470-6c16b55b8818_0(1eb92f47f470b6d63e1d7fe8bfc24a17b37323a168762c12f73299baf6e81e4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" podUID="671892a9-6a46-445d-b470-6c16b55b8818" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.653961 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(abb05f0e4fc072eca9fb1bf009cc07cd48ccb955d8f0cf6c0a65f2c4177e0e86): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.654048 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(abb05f0e4fc072eca9fb1bf009cc07cd48ccb955d8f0cf6c0a65f2c4177e0e86): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.654079 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(abb05f0e4fc072eca9fb1bf009cc07cd48ccb955d8f0cf6c0a65f2c4177e0e86): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:28:55 crc kubenswrapper[4909]: E1002 18:28:55.654146 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators(109ab413-2eeb-42e1-b39f-feddfe589bcc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_openshift-operators_109ab413-2eeb-42e1-b39f-feddfe589bcc_0(abb05f0e4fc072eca9fb1bf009cc07cd48ccb955d8f0cf6c0a65f2c4177e0e86): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" podUID="109ab413-2eeb-42e1-b39f-feddfe589bcc" Oct 02 18:28:58 crc kubenswrapper[4909]: I1002 18:28:58.608748 4909 scope.go:117] "RemoveContainer" containerID="44f20c62d21719caf766d226880133866d4504c5d7cc78655ca0c3b1dc2b8f95" Oct 02 18:28:59 crc kubenswrapper[4909]: I1002 18:28:59.375214 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gpnt_c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e/kube-multus/2.log" Oct 02 18:28:59 crc kubenswrapper[4909]: I1002 18:28:59.375727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gpnt" event={"ID":"c726d74b-8f88-4eed-b59e-6dc5f5dcbc0e","Type":"ContainerStarted","Data":"939bde543f65ff900821daa0ae018f984d39e54337f489386ff0165df37272dd"} Oct 02 18:29:02 crc kubenswrapper[4909]: I1002 18:29:02.637088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2bbt" Oct 02 18:29:03 crc kubenswrapper[4909]: I1002 18:29:03.607950 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:29:03 crc kubenswrapper[4909]: I1002 18:29:03.608288 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:29:03 crc kubenswrapper[4909]: I1002 18:29:03.609410 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" Oct 02 18:29:03 crc kubenswrapper[4909]: I1002 18:29:03.609634 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:29:04 crc kubenswrapper[4909]: I1002 18:29:04.100958 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb"] Oct 02 18:29:04 crc kubenswrapper[4909]: I1002 18:29:04.106466 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mwrxx"] Oct 02 18:29:04 crc kubenswrapper[4909]: W1002 18:29:04.107246 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38edcfed_aa44_406d_9028_8395eb3ebb06.slice/crio-50aac6349c6a249c03efe313330c8d3d572c489b06c26f626be3bfc19e206315 WatchSource:0}: Error finding container 50aac6349c6a249c03efe313330c8d3d572c489b06c26f626be3bfc19e206315: Status 404 returned error can't find the container with id 50aac6349c6a249c03efe313330c8d3d572c489b06c26f626be3bfc19e206315 Oct 02 18:29:04 crc kubenswrapper[4909]: W1002 18:29:04.118272 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b958c5_5914_4978_884c_e9f42430f52f.slice/crio-f2e0256b268d34ff7ae296fd6d3f9e8732b7230ca56503b4b370d882f1949987 WatchSource:0}: Error finding container f2e0256b268d34ff7ae296fd6d3f9e8732b7230ca56503b4b370d882f1949987: Status 404 returned error can't find the container with id f2e0256b268d34ff7ae296fd6d3f9e8732b7230ca56503b4b370d882f1949987 Oct 02 18:29:04 crc kubenswrapper[4909]: I1002 18:29:04.410346 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" event={"ID":"38edcfed-aa44-406d-9028-8395eb3ebb06","Type":"ContainerStarted","Data":"50aac6349c6a249c03efe313330c8d3d572c489b06c26f626be3bfc19e206315"} Oct 02 18:29:04 crc kubenswrapper[4909]: I1002 18:29:04.411307 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" event={"ID":"e4b958c5-5914-4978-884c-e9f42430f52f","Type":"ContainerStarted","Data":"f2e0256b268d34ff7ae296fd6d3f9e8732b7230ca56503b4b370d882f1949987"} Oct 02 18:29:06 crc kubenswrapper[4909]: I1002 18:29:06.607608 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:29:06 crc kubenswrapper[4909]: I1002 18:29:06.608470 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" Oct 02 18:29:06 crc kubenswrapper[4909]: I1002 18:29:06.920193 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm"] Oct 02 18:29:06 crc kubenswrapper[4909]: W1002 18:29:06.933293 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355b5784_3dc4_4a65_93d8_4bdddd018358.slice/crio-d8976ad41dd3ec5bb303fce95fa07d2cc35725bf829ff871de79ff0fca3e4c8b WatchSource:0}: Error finding container d8976ad41dd3ec5bb303fce95fa07d2cc35725bf829ff871de79ff0fca3e4c8b: Status 404 returned error can't find the container with id d8976ad41dd3ec5bb303fce95fa07d2cc35725bf829ff871de79ff0fca3e4c8b Oct 02 18:29:07 crc kubenswrapper[4909]: I1002 18:29:07.434409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" event={"ID":"355b5784-3dc4-4a65-93d8-4bdddd018358","Type":"ContainerStarted","Data":"d8976ad41dd3ec5bb303fce95fa07d2cc35725bf829ff871de79ff0fca3e4c8b"} Oct 02 18:29:07 crc kubenswrapper[4909]: I1002 18:29:07.608295 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:29:07 crc kubenswrapper[4909]: I1002 18:29:07.608884 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:29:07 crc kubenswrapper[4909]: I1002 18:29:07.846697 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9btfm"] Oct 02 18:29:08 crc kubenswrapper[4909]: I1002 18:29:08.440675 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" event={"ID":"671892a9-6a46-445d-b470-6c16b55b8818","Type":"ContainerStarted","Data":"3289caaffce49178835d8198867facee0103cf1b0df93e1156b118e9e6930ede"} Oct 02 18:29:08 crc kubenswrapper[4909]: I1002 18:29:08.607958 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:29:08 crc kubenswrapper[4909]: I1002 18:29:08.608698 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" Oct 02 18:29:08 crc kubenswrapper[4909]: I1002 18:29:08.869974 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk"] Oct 02 18:29:08 crc kubenswrapper[4909]: W1002 18:29:08.877334 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109ab413_2eeb_42e1_b39f_feddfe589bcc.slice/crio-afdd71c846b10582195c591576a79568b2de24186266c5159940520b22df92a3 WatchSource:0}: Error finding container afdd71c846b10582195c591576a79568b2de24186266c5159940520b22df92a3: Status 404 returned error can't find the container with id afdd71c846b10582195c591576a79568b2de24186266c5159940520b22df92a3 Oct 02 18:29:09 crc kubenswrapper[4909]: I1002 18:29:09.446901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" event={"ID":"109ab413-2eeb-42e1-b39f-feddfe589bcc","Type":"ContainerStarted","Data":"afdd71c846b10582195c591576a79568b2de24186266c5159940520b22df92a3"} Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.505794 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" event={"ID":"109ab413-2eeb-42e1-b39f-feddfe589bcc","Type":"ContainerStarted","Data":"5d943bb07a61a1c11e3080bebac1debbf9d0fa5b3721bf27d26839419e53b513"} Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.509324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" event={"ID":"671892a9-6a46-445d-b470-6c16b55b8818","Type":"ContainerStarted","Data":"f8e5d869c6ffa49a527ce85826de725d7f6866b0fd85a3c91eb39ce562021c4a"} Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.510463 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.515435 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" event={"ID":"38edcfed-aa44-406d-9028-8395eb3ebb06","Type":"ContainerStarted","Data":"c3f42aea57e1e913b29b57daf8b96a48bead2b684a185e11e3069a087bae8669"} Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.517799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" event={"ID":"e4b958c5-5914-4978-884c-e9f42430f52f","Type":"ContainerStarted","Data":"8ade1a513f2f9f928031a36c24b9dd16bfc12d7ef290fc989ad9bddf1b980959"} Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.518051 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.519869 4909 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-mwrxx container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.45:8081/healthz\": dial tcp 10.217.0.45:8081: connect: connection refused" start-of-body= Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.519926 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" podUID="e4b958c5-5914-4978-884c-e9f42430f52f" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.45:8081/healthz\": dial tcp 10.217.0.45:8081: connect: connection refused" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.539243 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk" podStartSLOduration=32.253359941 podStartE2EDuration="40.539218398s" 
podCreationTimestamp="2025-10-02 18:28:37 +0000 UTC" firstStartedPulling="2025-10-02 18:29:08.893518463 +0000 UTC m=+670.081014322" lastFinishedPulling="2025-10-02 18:29:17.17937691 +0000 UTC m=+678.366872779" observedRunningTime="2025-10-02 18:29:17.532747666 +0000 UTC m=+678.720243525" watchObservedRunningTime="2025-10-02 18:29:17.539218398 +0000 UTC m=+678.726714247" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.592502 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" podStartSLOduration=27.541675922 podStartE2EDuration="40.592485647s" podCreationTimestamp="2025-10-02 18:28:37 +0000 UTC" firstStartedPulling="2025-10-02 18:29:04.123824506 +0000 UTC m=+665.311320365" lastFinishedPulling="2025-10-02 18:29:17.174634221 +0000 UTC m=+678.362130090" observedRunningTime="2025-10-02 18:29:17.58973673 +0000 UTC m=+678.777232589" watchObservedRunningTime="2025-10-02 18:29:17.592485647 +0000 UTC m=+678.779981506" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.593199 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb" podStartSLOduration=27.534809196 podStartE2EDuration="40.593193598s" podCreationTimestamp="2025-10-02 18:28:37 +0000 UTC" firstStartedPulling="2025-10-02 18:29:04.114991929 +0000 UTC m=+665.302487788" lastFinishedPulling="2025-10-02 18:29:17.173376301 +0000 UTC m=+678.360872190" observedRunningTime="2025-10-02 18:29:17.562212038 +0000 UTC m=+678.749707897" watchObservedRunningTime="2025-10-02 18:29:17.593193598 +0000 UTC m=+678.780689457" Oct 02 18:29:17 crc kubenswrapper[4909]: I1002 18:29:17.622654 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" podStartSLOduration=30.31768473 podStartE2EDuration="39.62263441s" podCreationTimestamp="2025-10-02 18:28:38 +0000 UTC" 
firstStartedPulling="2025-10-02 18:29:07.869551967 +0000 UTC m=+669.057047826" lastFinishedPulling="2025-10-02 18:29:17.174501647 +0000 UTC m=+678.361997506" observedRunningTime="2025-10-02 18:29:17.619210433 +0000 UTC m=+678.806706292" watchObservedRunningTime="2025-10-02 18:29:17.62263441 +0000 UTC m=+678.810130259" Oct 02 18:29:18 crc kubenswrapper[4909]: I1002 18:29:18.188815 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-mwrxx" Oct 02 18:29:18 crc kubenswrapper[4909]: I1002 18:29:18.530596 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" event={"ID":"355b5784-3dc4-4a65-93d8-4bdddd018358","Type":"ContainerStarted","Data":"3359cc96465806a0074258cd8f5a97b5235a10305da1e092ccd15c9ca37e8b06"} Oct 02 18:29:18 crc kubenswrapper[4909]: I1002 18:29:18.553462 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-zhfgm" podStartSLOduration=31.314491811 podStartE2EDuration="41.553436089s" podCreationTimestamp="2025-10-02 18:28:37 +0000 UTC" firstStartedPulling="2025-10-02 18:29:06.935149266 +0000 UTC m=+668.122645125" lastFinishedPulling="2025-10-02 18:29:17.174093524 +0000 UTC m=+678.361589403" observedRunningTime="2025-10-02 18:29:18.551119046 +0000 UTC m=+679.738614995" watchObservedRunningTime="2025-10-02 18:29:18.553436089 +0000 UTC m=+679.740931978" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.202348 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lf4lx"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.203684 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.214695 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.214762 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.214885 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qxcck" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.216584 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wscv9"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.217472 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wscv9" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.223796 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wscv9"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.228059 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lf4lx"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.228837 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-w96kc" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.242863 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nqkz8"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.243614 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.245445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vh955" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.257295 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nqkz8"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.283936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5448z\" (UniqueName: \"kubernetes.io/projected/69b56b01-8070-4514-8168-51f33b7a2d07-kube-api-access-5448z\") pod \"cert-manager-5b446d88c5-wscv9\" (UID: \"69b56b01-8070-4514-8168-51f33b7a2d07\") " pod="cert-manager/cert-manager-5b446d88c5-wscv9" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.284006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbg6p\" (UniqueName: \"kubernetes.io/projected/52d78023-98ec-431e-a697-0aab89fc7e8a-kube-api-access-dbg6p\") pod \"cert-manager-webhook-5655c58dd6-nqkz8\" (UID: \"52d78023-98ec-431e-a697-0aab89fc7e8a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.284129 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmb82\" (UniqueName: \"kubernetes.io/projected/48294402-b72e-4b64-a227-a4ccb355ef9f-kube-api-access-mmb82\") pod \"cert-manager-cainjector-7f985d654d-lf4lx\" (UID: \"48294402-b72e-4b64-a227-a4ccb355ef9f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.385468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbg6p\" (UniqueName: 
\"kubernetes.io/projected/52d78023-98ec-431e-a697-0aab89fc7e8a-kube-api-access-dbg6p\") pod \"cert-manager-webhook-5655c58dd6-nqkz8\" (UID: \"52d78023-98ec-431e-a697-0aab89fc7e8a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.385562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmb82\" (UniqueName: \"kubernetes.io/projected/48294402-b72e-4b64-a227-a4ccb355ef9f-kube-api-access-mmb82\") pod \"cert-manager-cainjector-7f985d654d-lf4lx\" (UID: \"48294402-b72e-4b64-a227-a4ccb355ef9f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.385621 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5448z\" (UniqueName: \"kubernetes.io/projected/69b56b01-8070-4514-8168-51f33b7a2d07-kube-api-access-5448z\") pod \"cert-manager-5b446d88c5-wscv9\" (UID: \"69b56b01-8070-4514-8168-51f33b7a2d07\") " pod="cert-manager/cert-manager-5b446d88c5-wscv9" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.408587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbg6p\" (UniqueName: \"kubernetes.io/projected/52d78023-98ec-431e-a697-0aab89fc7e8a-kube-api-access-dbg6p\") pod \"cert-manager-webhook-5655c58dd6-nqkz8\" (UID: \"52d78023-98ec-431e-a697-0aab89fc7e8a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.408597 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmb82\" (UniqueName: \"kubernetes.io/projected/48294402-b72e-4b64-a227-a4ccb355ef9f-kube-api-access-mmb82\") pod \"cert-manager-cainjector-7f985d654d-lf4lx\" (UID: \"48294402-b72e-4b64-a227-a4ccb355ef9f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.415116 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5448z\" (UniqueName: \"kubernetes.io/projected/69b56b01-8070-4514-8168-51f33b7a2d07-kube-api-access-5448z\") pod \"cert-manager-5b446d88c5-wscv9\" (UID: \"69b56b01-8070-4514-8168-51f33b7a2d07\") " pod="cert-manager/cert-manager-5b446d88c5-wscv9" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.530130 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.540788 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wscv9" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.558275 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.853356 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wscv9"] Oct 02 18:29:24 crc kubenswrapper[4909]: I1002 18:29:24.996011 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lf4lx"] Oct 02 18:29:25 crc kubenswrapper[4909]: I1002 18:29:25.091685 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nqkz8"] Oct 02 18:29:25 crc kubenswrapper[4909]: I1002 18:29:25.583142 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wscv9" event={"ID":"69b56b01-8070-4514-8168-51f33b7a2d07","Type":"ContainerStarted","Data":"77783fc3e6b94a970d9f5aacd97a8c38099f40ec9e7159047a531281e0dcf1ba"} Oct 02 18:29:25 crc kubenswrapper[4909]: I1002 18:29:25.584199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" 
event={"ID":"52d78023-98ec-431e-a697-0aab89fc7e8a","Type":"ContainerStarted","Data":"ff8c250c07985aa104562e50721eeca305e9cd7769b8bc29cfb52ddb78f9cf42"} Oct 02 18:29:25 crc kubenswrapper[4909]: I1002 18:29:25.585258 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" event={"ID":"48294402-b72e-4b64-a227-a4ccb355ef9f","Type":"ContainerStarted","Data":"61ce0aec3bbf37683d387fffd7984d957210cc4c7bdcf6a2ea76cb5bb513d8f1"} Oct 02 18:29:28 crc kubenswrapper[4909]: I1002 18:29:28.349063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-9btfm" Oct 02 18:29:33 crc kubenswrapper[4909]: I1002 18:29:33.664004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wscv9" event={"ID":"69b56b01-8070-4514-8168-51f33b7a2d07","Type":"ContainerStarted","Data":"4b8acea4555fb6d928d511bb4b3af32f5822366004f9c639901f4c4994b52f38"} Oct 02 18:29:33 crc kubenswrapper[4909]: I1002 18:29:33.677051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" event={"ID":"52d78023-98ec-431e-a697-0aab89fc7e8a","Type":"ContainerStarted","Data":"6e738061af4be432cc113af053d69b94e12a74e9fdb25f2385d848b0ced08fa0"} Oct 02 18:29:33 crc kubenswrapper[4909]: I1002 18:29:33.678063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:29:33 crc kubenswrapper[4909]: I1002 18:29:33.689368 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-wscv9" podStartSLOduration=1.318153449 podStartE2EDuration="9.689352948s" podCreationTimestamp="2025-10-02 18:29:24 +0000 UTC" firstStartedPulling="2025-10-02 18:29:24.867876889 +0000 UTC m=+686.055372748" lastFinishedPulling="2025-10-02 18:29:33.239076388 +0000 UTC m=+694.426572247" 
observedRunningTime="2025-10-02 18:29:33.683690411 +0000 UTC m=+694.871186280" watchObservedRunningTime="2025-10-02 18:29:33.689352948 +0000 UTC m=+694.876848817" Oct 02 18:29:33 crc kubenswrapper[4909]: I1002 18:29:33.711943 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" podStartSLOduration=1.577826481 podStartE2EDuration="9.711923825s" podCreationTimestamp="2025-10-02 18:29:24 +0000 UTC" firstStartedPulling="2025-10-02 18:29:25.100331648 +0000 UTC m=+686.287827507" lastFinishedPulling="2025-10-02 18:29:33.234428982 +0000 UTC m=+694.421924851" observedRunningTime="2025-10-02 18:29:33.708684954 +0000 UTC m=+694.896180813" watchObservedRunningTime="2025-10-02 18:29:33.711923825 +0000 UTC m=+694.899419684" Oct 02 18:29:35 crc kubenswrapper[4909]: I1002 18:29:35.692962 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" event={"ID":"48294402-b72e-4b64-a227-a4ccb355ef9f","Type":"ContainerStarted","Data":"f2d9878267148d9bf23f897336bf502bf970041631f04a4097bcd9db18f9ef23"} Oct 02 18:29:35 crc kubenswrapper[4909]: I1002 18:29:35.716761 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-lf4lx" podStartSLOduration=1.26998228 podStartE2EDuration="11.716741507s" podCreationTimestamp="2025-10-02 18:29:24 +0000 UTC" firstStartedPulling="2025-10-02 18:29:25.006100427 +0000 UTC m=+686.193596286" lastFinishedPulling="2025-10-02 18:29:35.452859644 +0000 UTC m=+696.640355513" observedRunningTime="2025-10-02 18:29:35.715814298 +0000 UTC m=+696.903310167" watchObservedRunningTime="2025-10-02 18:29:35.716741507 +0000 UTC m=+696.904237376" Oct 02 18:29:39 crc kubenswrapper[4909]: I1002 18:29:39.563155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nqkz8" Oct 02 18:30:00 crc kubenswrapper[4909]: 
I1002 18:30:00.148836 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4"] Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.151632 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.154638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hspqz\" (UniqueName: \"kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.154735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.154808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.155093 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.157708 4909 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.163063 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4"] Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.256193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.256297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hspqz\" (UniqueName: \"kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.256361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.257602 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 
crc kubenswrapper[4909]: I1002 18:30:00.262017 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.273715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hspqz\" (UniqueName: \"kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz\") pod \"collect-profiles-29323830-8kpp4\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.467723 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:00 crc kubenswrapper[4909]: I1002 18:30:00.928638 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4"] Oct 02 18:30:00 crc kubenswrapper[4909]: W1002 18:30:00.939637 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d661e3c_9a8e_4823_98c0_7abf07813d89.slice/crio-35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede WatchSource:0}: Error finding container 35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede: Status 404 returned error can't find the container with id 35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede Oct 02 18:30:01 crc kubenswrapper[4909]: I1002 18:30:01.883264 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d661e3c-9a8e-4823-98c0-7abf07813d89" 
containerID="332a3f19c434e5b1f11cdc97c013bf23272ac622306facdbffbaba594f650af7" exitCode=0 Oct 02 18:30:01 crc kubenswrapper[4909]: I1002 18:30:01.883372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" event={"ID":"7d661e3c-9a8e-4823-98c0-7abf07813d89","Type":"ContainerDied","Data":"332a3f19c434e5b1f11cdc97c013bf23272ac622306facdbffbaba594f650af7"} Oct 02 18:30:01 crc kubenswrapper[4909]: I1002 18:30:01.883849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" event={"ID":"7d661e3c-9a8e-4823-98c0-7abf07813d89","Type":"ContainerStarted","Data":"35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede"} Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.190471 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.300707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hspqz\" (UniqueName: \"kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz\") pod \"7d661e3c-9a8e-4823-98c0-7abf07813d89\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.300771 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume\") pod \"7d661e3c-9a8e-4823-98c0-7abf07813d89\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.300821 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume\") pod 
\"7d661e3c-9a8e-4823-98c0-7abf07813d89\" (UID: \"7d661e3c-9a8e-4823-98c0-7abf07813d89\") " Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.302687 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d661e3c-9a8e-4823-98c0-7abf07813d89" (UID: "7d661e3c-9a8e-4823-98c0-7abf07813d89"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.307974 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz" (OuterVolumeSpecName: "kube-api-access-hspqz") pod "7d661e3c-9a8e-4823-98c0-7abf07813d89" (UID: "7d661e3c-9a8e-4823-98c0-7abf07813d89"). InnerVolumeSpecName "kube-api-access-hspqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.312649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d661e3c-9a8e-4823-98c0-7abf07813d89" (UID: "7d661e3c-9a8e-4823-98c0-7abf07813d89"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.402517 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d661e3c-9a8e-4823-98c0-7abf07813d89-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.402585 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hspqz\" (UniqueName: \"kubernetes.io/projected/7d661e3c-9a8e-4823-98c0-7abf07813d89-kube-api-access-hspqz\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.402603 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d661e3c-9a8e-4823-98c0-7abf07813d89-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.901523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" event={"ID":"7d661e3c-9a8e-4823-98c0-7abf07813d89","Type":"ContainerDied","Data":"35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede"} Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.902065 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35877afddec1de8978c3e9247b8342a49f3d3ac6717e0e0990d07ec7180abede" Oct 02 18:30:03 crc kubenswrapper[4909]: I1002 18:30:03.901639 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.623269 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb"] Oct 02 18:30:10 crc kubenswrapper[4909]: E1002 18:30:10.624094 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d661e3c-9a8e-4823-98c0-7abf07813d89" containerName="collect-profiles" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.624118 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d661e3c-9a8e-4823-98c0-7abf07813d89" containerName="collect-profiles" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.624307 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d661e3c-9a8e-4823-98c0-7abf07813d89" containerName="collect-profiles" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.625731 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.630291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.635787 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb"] Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.706953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt22b\" (UniqueName: \"kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.707081 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.707119 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: 
I1002 18:30:10.808231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt22b\" (UniqueName: \"kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.808888 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.809477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.809906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.810243 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.817919 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9"] Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.819128 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.833276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt22b\" (UniqueName: \"kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.872010 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9"] Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.910684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9vs\" (UniqueName: \"kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.910731 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.910772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:10 crc kubenswrapper[4909]: I1002 18:30:10.992355 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.015073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.015214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9vs\" (UniqueName: \"kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " 
pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.015249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.015832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.015990 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.049420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9vs\" (UniqueName: \"kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.134670 4909 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.215997 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb"] Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.384847 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9"] Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.955343 4909 generic.go:334] "Generic (PLEG): container finished" podID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerID="1ec68b7df78f8464e800683d3a9af7bfa65510f881c69433be2a425a8748c5d9" exitCode=0 Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.955437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" event={"ID":"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01","Type":"ContainerDied","Data":"1ec68b7df78f8464e800683d3a9af7bfa65510f881c69433be2a425a8748c5d9"} Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.955477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" event={"ID":"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01","Type":"ContainerStarted","Data":"50a9a8094f9d7c91336d3710aff45c03e9ab2152889bd7d2f9868eb8adf84940"} Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.960422 4909 generic.go:334] "Generic (PLEG): container finished" podID="1f391df0-3f52-4966-8661-04ecd7d41088" containerID="3b97ed8c038f65f4e6146a7fcbda93845f591ed192eaf34791946c7cf0b8cfd4" exitCode=0 Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.960466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" event={"ID":"1f391df0-3f52-4966-8661-04ecd7d41088","Type":"ContainerDied","Data":"3b97ed8c038f65f4e6146a7fcbda93845f591ed192eaf34791946c7cf0b8cfd4"} Oct 02 18:30:11 crc kubenswrapper[4909]: I1002 18:30:11.960491 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" event={"ID":"1f391df0-3f52-4966-8661-04ecd7d41088","Type":"ContainerStarted","Data":"f84617b65e850484b0ef7e542a3c5eddec2fff3e76bb6ea94ed08ce13e96ffdf"} Oct 02 18:30:14 crc kubenswrapper[4909]: I1002 18:30:14.984708 4909 generic.go:334] "Generic (PLEG): container finished" podID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerID="32d6ad1c50c051c7ffdcb90574b4bdb713e7670695cf924aee6044f6c939c610" exitCode=0 Oct 02 18:30:14 crc kubenswrapper[4909]: I1002 18:30:14.984785 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" event={"ID":"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01","Type":"ContainerDied","Data":"32d6ad1c50c051c7ffdcb90574b4bdb713e7670695cf924aee6044f6c939c610"} Oct 02 18:30:14 crc kubenswrapper[4909]: I1002 18:30:14.988285 4909 generic.go:334] "Generic (PLEG): container finished" podID="1f391df0-3f52-4966-8661-04ecd7d41088" containerID="6cfbcb665b111ecc0a20c91e160c6f85d965133f5b7c1caf60ce05e04c4ca428" exitCode=0 Oct 02 18:30:14 crc kubenswrapper[4909]: I1002 18:30:14.988334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" event={"ID":"1f391df0-3f52-4966-8661-04ecd7d41088","Type":"ContainerDied","Data":"6cfbcb665b111ecc0a20c91e160c6f85d965133f5b7c1caf60ce05e04c4ca428"} Oct 02 18:30:15 crc kubenswrapper[4909]: I1002 18:30:15.998619 4909 generic.go:334] "Generic (PLEG): container finished" podID="1f391df0-3f52-4966-8661-04ecd7d41088" 
containerID="6e60611f96d3c44666085c494e002821a03f1cc81fd1cdbccb212ce8a508bb66" exitCode=0 Oct 02 18:30:15 crc kubenswrapper[4909]: I1002 18:30:15.998704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" event={"ID":"1f391df0-3f52-4966-8661-04ecd7d41088","Type":"ContainerDied","Data":"6e60611f96d3c44666085c494e002821a03f1cc81fd1cdbccb212ce8a508bb66"} Oct 02 18:30:16 crc kubenswrapper[4909]: I1002 18:30:16.001318 4909 generic.go:334] "Generic (PLEG): container finished" podID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerID="23e14c29184748171fb7b5bf851eaa17b046bbf18634a1a9b2c805cd5386d489" exitCode=0 Oct 02 18:30:16 crc kubenswrapper[4909]: I1002 18:30:16.001379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" event={"ID":"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01","Type":"ContainerDied","Data":"23e14c29184748171fb7b5bf851eaa17b046bbf18634a1a9b2c805cd5386d489"} Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.366955 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.379274 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515634 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util\") pod \"1f391df0-3f52-4966-8661-04ecd7d41088\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle\") pod \"1f391df0-3f52-4966-8661-04ecd7d41088\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle\") pod \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt22b\" (UniqueName: \"kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b\") pod \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515867 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk9vs\" (UniqueName: \"kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs\") pod \"1f391df0-3f52-4966-8661-04ecd7d41088\" (UID: \"1f391df0-3f52-4966-8661-04ecd7d41088\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.515906 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util\") pod \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\" (UID: \"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01\") " Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.516676 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle" (OuterVolumeSpecName: "bundle") pod "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" (UID: "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.517613 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle" (OuterVolumeSpecName: "bundle") pod "1f391df0-3f52-4966-8661-04ecd7d41088" (UID: "1f391df0-3f52-4966-8661-04ecd7d41088"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.530332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs" (OuterVolumeSpecName: "kube-api-access-jk9vs") pod "1f391df0-3f52-4966-8661-04ecd7d41088" (UID: "1f391df0-3f52-4966-8661-04ecd7d41088"). InnerVolumeSpecName "kube-api-access-jk9vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.531228 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b" (OuterVolumeSpecName: "kube-api-access-xt22b") pod "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" (UID: "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01"). InnerVolumeSpecName "kube-api-access-xt22b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.631742 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt22b\" (UniqueName: \"kubernetes.io/projected/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-kube-api-access-xt22b\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.631771 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk9vs\" (UniqueName: \"kubernetes.io/projected/1f391df0-3f52-4966-8661-04ecd7d41088-kube-api-access-jk9vs\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.631782 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.631790 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.785118 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util" (OuterVolumeSpecName: "util") pod "1f391df0-3f52-4966-8661-04ecd7d41088" (UID: "1f391df0-3f52-4966-8661-04ecd7d41088"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.833915 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f391df0-3f52-4966-8661-04ecd7d41088-util\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.862225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util" (OuterVolumeSpecName: "util") pod "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" (UID: "34d4acd8-da3b-4b4d-801f-4f7ccc6cac01"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:30:17 crc kubenswrapper[4909]: I1002 18:30:17.935013 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d4acd8-da3b-4b4d-801f-4f7ccc6cac01-util\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.023654 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" event={"ID":"34d4acd8-da3b-4b4d-801f-4f7ccc6cac01","Type":"ContainerDied","Data":"50a9a8094f9d7c91336d3710aff45c03e9ab2152889bd7d2f9868eb8adf84940"} Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.023721 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a9a8094f9d7c91336d3710aff45c03e9ab2152889bd7d2f9868eb8adf84940" Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.023814 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb" Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.031007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" event={"ID":"1f391df0-3f52-4966-8661-04ecd7d41088","Type":"ContainerDied","Data":"f84617b65e850484b0ef7e542a3c5eddec2fff3e76bb6ea94ed08ce13e96ffdf"} Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.031131 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84617b65e850484b0ef7e542a3c5eddec2fff3e76bb6ea94ed08ce13e96ffdf" Oct 02 18:30:18 crc kubenswrapper[4909]: I1002 18:30:18.031242 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9" Oct 02 18:30:23 crc kubenswrapper[4909]: I1002 18:30:23.055255 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:30:23 crc kubenswrapper[4909]: I1002 18:30:23.056188 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:30:26 crc kubenswrapper[4909]: I1002 18:30:26.927969 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"] Oct 02 18:30:26 crc kubenswrapper[4909]: I1002 18:30:26.928455 4909 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" containerName="controller-manager" containerID="cri-o://8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc" gracePeriod=30 Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.036706 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"] Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.037091 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" podUID="b88a862d-943e-4321-9dac-8bc48701e1d6" containerName="route-controller-manager" containerID="cri-o://83d170020d3b48b19b95904733fb705249c724f2dee9f6299d72e30872769683" gracePeriod=30 Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.812211 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"] Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813514 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813533 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813550 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813558 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813577 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="pull" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813586 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="pull" Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813598 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="util" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813607 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="util" Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813620 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="util" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813628 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="util" Oct 02 18:30:27 crc kubenswrapper[4909]: E1002 18:30:27.813641 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="pull" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813650 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="pull" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813801 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f391df0-3f52-4966-8661-04ecd7d41088" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.813818 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d4acd8-da3b-4b4d-801f-4f7ccc6cac01" containerName="extract" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.814765 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.833725 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-7g6pq" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.833826 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.834082 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.834220 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.834399 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.834654 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.845993 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"] Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.975998 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.992667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbp5\" (UniqueName: \"kubernetes.io/projected/32aac12f-cd5f-4a59-8b82-051057ed0e70-kube-api-access-jtbp5\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.992735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/32aac12f-cd5f-4a59-8b82-051057ed0e70-manager-config\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.992762 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-apiservice-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.992802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" 
Oct 02 18:30:27 crc kubenswrapper[4909]: I1002 18:30:27.992821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-webhook-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.025504 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"] Oct 02 18:30:28 crc kubenswrapper[4909]: E1002 18:30:28.025880 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" containerName="controller-manager" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.025901 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" containerName="controller-manager" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.026066 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" containerName="controller-manager" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.026711 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.040964 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"] Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093327 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config\") pod \"df0eb575-3d34-41d8-b6ac-12225721c074\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspth\" (UniqueName: \"kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth\") pod \"df0eb575-3d34-41d8-b6ac-12225721c074\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093416 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles\") pod \"df0eb575-3d34-41d8-b6ac-12225721c074\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093465 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert\") pod \"df0eb575-3d34-41d8-b6ac-12225721c074\" (UID: \"df0eb575-3d34-41d8-b6ac-12225721c074\") " Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093514 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca\") pod \"df0eb575-3d34-41d8-b6ac-12225721c074\" (UID: 
\"df0eb575-3d34-41d8-b6ac-12225721c074\") " Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093683 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbp5\" (UniqueName: \"kubernetes.io/projected/32aac12f-cd5f-4a59-8b82-051057ed0e70-kube-api-access-jtbp5\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093725 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/32aac12f-cd5f-4a59-8b82-051057ed0e70-manager-config\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-apiservice-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093785 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-webhook-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.093805 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.095131 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config" (OuterVolumeSpecName: "config") pod "df0eb575-3d34-41d8-b6ac-12225721c074" (UID: "df0eb575-3d34-41d8-b6ac-12225721c074"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.100831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "df0eb575-3d34-41d8-b6ac-12225721c074" (UID: "df0eb575-3d34-41d8-b6ac-12225721c074"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.101398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/32aac12f-cd5f-4a59-8b82-051057ed0e70-manager-config\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.101597 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca" (OuterVolumeSpecName: "client-ca") pod "df0eb575-3d34-41d8-b6ac-12225721c074" (UID: "df0eb575-3d34-41d8-b6ac-12225721c074"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.107436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-apiservice-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.110678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.111909 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth" (OuterVolumeSpecName: "kube-api-access-gspth") pod "df0eb575-3d34-41d8-b6ac-12225721c074" (UID: "df0eb575-3d34-41d8-b6ac-12225721c074"). InnerVolumeSpecName "kube-api-access-gspth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.112584 4909 generic.go:334] "Generic (PLEG): container finished" podID="df0eb575-3d34-41d8-b6ac-12225721c074" containerID="8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc" exitCode=0
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.112881 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.112880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" event={"ID":"df0eb575-3d34-41d8-b6ac-12225721c074","Type":"ContainerDied","Data":"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"}
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.113162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xhtq7" event={"ID":"df0eb575-3d34-41d8-b6ac-12225721c074","Type":"ContainerDied","Data":"bf22872882d10fefdcf13697cbc6dc13a75ebf5711d5419d244605b2d02c5cee"}
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.113199 4909 scope.go:117] "RemoveContainer" containerID="8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.113297 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df0eb575-3d34-41d8-b6ac-12225721c074" (UID: "df0eb575-3d34-41d8-b6ac-12225721c074"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.130004 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32aac12f-cd5f-4a59-8b82-051057ed0e70-webhook-cert\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.133284 4909 generic.go:334] "Generic (PLEG): container finished" podID="b88a862d-943e-4321-9dac-8bc48701e1d6" containerID="83d170020d3b48b19b95904733fb705249c724f2dee9f6299d72e30872769683" exitCode=0
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.133331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" event={"ID":"b88a862d-943e-4321-9dac-8bc48701e1d6","Type":"ContainerDied","Data":"83d170020d3b48b19b95904733fb705249c724f2dee9f6299d72e30872769683"}
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.133357 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq" event={"ID":"b88a862d-943e-4321-9dac-8bc48701e1d6","Type":"ContainerDied","Data":"2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa"}
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.133370 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be11a3844183f711e996db9a6798dc41d9989c51e2e3c10c5d6d9e73860f8fa"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.135466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbp5\" (UniqueName: \"kubernetes.io/projected/32aac12f-cd5f-4a59-8b82-051057ed0e70-kube-api-access-jtbp5\") pod \"loki-operator-controller-manager-576dc5b57d-njbh7\" (UID: \"32aac12f-cd5f-4a59-8b82-051057ed0e70\") " pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.173366 4909 scope.go:117] "RemoveContainer" containerID="8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"
Oct 02 18:30:28 crc kubenswrapper[4909]: E1002 18:30:28.173647 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc\": container with ID starting with 8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc not found: ID does not exist" containerID="8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.173674 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc"} err="failed to get container status \"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc\": rpc error: code = NotFound desc = could not find container \"8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc\": container with ID starting with 8c0f0d14d8e2d1a9a49716dfd4f246675620893dbd5e9eafcd0e62f4638fb5fc not found: ID does not exist"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.177430 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.195811 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196360 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config\") pod \"b88a862d-943e-4321-9dac-8bc48701e1d6\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") "
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196395 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert\") pod \"b88a862d-943e-4321-9dac-8bc48701e1d6\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") "
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196461 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf82t\" (UniqueName: \"kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t\") pod \"b88a862d-943e-4321-9dac-8bc48701e1d6\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") "
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196496 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca\") pod \"b88a862d-943e-4321-9dac-8bc48701e1d6\" (UID: \"b88a862d-943e-4321-9dac-8bc48701e1d6\") "
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196607 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdkc\" (UniqueName: \"kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196659 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196721 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196732 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196741 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspth\" (UniqueName: \"kubernetes.io/projected/df0eb575-3d34-41d8-b6ac-12225721c074-kube-api-access-gspth\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196749 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df0eb575-3d34-41d8-b6ac-12225721c074-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.196760 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df0eb575-3d34-41d8-b6ac-12225721c074-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.197498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config" (OuterVolumeSpecName: "config") pod "b88a862d-943e-4321-9dac-8bc48701e1d6" (UID: "b88a862d-943e-4321-9dac-8bc48701e1d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.198192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "b88a862d-943e-4321-9dac-8bc48701e1d6" (UID: "b88a862d-943e-4321-9dac-8bc48701e1d6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.215772 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b88a862d-943e-4321-9dac-8bc48701e1d6" (UID: "b88a862d-943e-4321-9dac-8bc48701e1d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.222692 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t" (OuterVolumeSpecName: "kube-api-access-hf82t") pod "b88a862d-943e-4321-9dac-8bc48701e1d6" (UID: "b88a862d-943e-4321-9dac-8bc48701e1d6"). InnerVolumeSpecName "kube-api-access-hf82t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298407 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdkc\" (UniqueName: \"kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298755 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88a862d-943e-4321-9dac-8bc48701e1d6-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298781 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf82t\" (UniqueName: \"kubernetes.io/projected/b88a862d-943e-4321-9dac-8bc48701e1d6-kube-api-access-hf82t\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298794 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.298803 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88a862d-943e-4321-9dac-8bc48701e1d6-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.300509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.300572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.301245 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.304789 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.331725 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdkc\" (UniqueName: \"kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc\") pod \"controller-manager-7797bcc454-hfj79\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") " pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.358482 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.461139 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"]
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.470582 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xhtq7"]
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.577816 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7"]
Oct 02 18:30:28 crc kubenswrapper[4909]: W1002 18:30:28.583571 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32aac12f_cd5f_4a59_8b82_051057ed0e70.slice/crio-63118c92db7ae6e2950407689d2a30ff0258805c217dffe1f48db4a52c91d833 WatchSource:0}: Error finding container 63118c92db7ae6e2950407689d2a30ff0258805c217dffe1f48db4a52c91d833: Status 404 returned error can't find the container with id 63118c92db7ae6e2950407689d2a30ff0258805c217dffe1f48db4a52c91d833
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.660099 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"]
Oct 02 18:30:28 crc kubenswrapper[4909]: I1002 18:30:28.974235 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"]
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.034310 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"]
Oct 02 18:30:29 crc kubenswrapper[4909]: E1002 18:30:29.034552 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88a862d-943e-4321-9dac-8bc48701e1d6" containerName="route-controller-manager"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.034566 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88a862d-943e-4321-9dac-8bc48701e1d6" containerName="route-controller-manager"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.034674 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88a862d-943e-4321-9dac-8bc48701e1d6" containerName="route-controller-manager"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.035102 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.052472 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"]
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.111918 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-config\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.111990 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-client-ca\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.112017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844e791c-e30c-4548-950d-9d5f84fd8005-serving-cert\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.112049 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdvm\" (UniqueName: \"kubernetes.io/projected/844e791c-e30c-4548-950d-9d5f84fd8005-kube-api-access-jcdvm\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.142210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" event={"ID":"32aac12f-cd5f-4a59-8b82-051057ed0e70","Type":"ContainerStarted","Data":"63118c92db7ae6e2950407689d2a30ff0258805c217dffe1f48db4a52c91d833"}
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.145604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" event={"ID":"cb484617-f6e4-4412-868f-ead2b91c20ec","Type":"ContainerStarted","Data":"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b"}
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.145650 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.145661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" event={"ID":"cb484617-f6e4-4412-868f-ead2b91c20ec","Type":"ContainerStarted","Data":"384a2dffea13ca1e6327982d243f253a92dcd0dc544b0dd40a3f8fda6b853355"}
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.146272 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.155490 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.207934 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" podStartSLOduration=3.207910648 podStartE2EDuration="3.207910648s" podCreationTimestamp="2025-10-02 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:30:29.179074799 +0000 UTC m=+750.366570668" watchObservedRunningTime="2025-10-02 18:30:29.207910648 +0000 UTC m=+750.395406507"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.213315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-client-ca\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.213372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844e791c-e30c-4548-950d-9d5f84fd8005-serving-cert\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.213397 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdvm\" (UniqueName: \"kubernetes.io/projected/844e791c-e30c-4548-950d-9d5f84fd8005-kube-api-access-jcdvm\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.213468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-config\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.214996 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-client-ca\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.215192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844e791c-e30c-4548-950d-9d5f84fd8005-config\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.221601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844e791c-e30c-4548-950d-9d5f84fd8005-serving-cert\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.225655 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"]
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.230132 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vsdsq"]
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.243879 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdvm\" (UniqueName: \"kubernetes.io/projected/844e791c-e30c-4548-950d-9d5f84fd8005-kube-api-access-jcdvm\") pod \"route-controller-manager-8644b89d4f-7rkcx\" (UID: \"844e791c-e30c-4548-950d-9d5f84fd8005\") " pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.350416 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.616754 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88a862d-943e-4321-9dac-8bc48701e1d6" path="/var/lib/kubelet/pods/b88a862d-943e-4321-9dac-8bc48701e1d6/volumes"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.617932 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0eb575-3d34-41d8-b6ac-12225721c074" path="/var/lib/kubelet/pods/df0eb575-3d34-41d8-b6ac-12225721c074/volumes"
Oct 02 18:30:29 crc kubenswrapper[4909]: I1002 18:30:29.618609 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"]
Oct 02 18:30:29 crc kubenswrapper[4909]: W1002 18:30:29.621678 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod844e791c_e30c_4548_950d_9d5f84fd8005.slice/crio-88c0e8389a28b077c4d0c329d2d576f3d7edf32a1baf654da3d3e1ea0a6be2c8 WatchSource:0}: Error finding container 88c0e8389a28b077c4d0c329d2d576f3d7edf32a1baf654da3d3e1ea0a6be2c8: Status 404 returned error can't find the container with id 88c0e8389a28b077c4d0c329d2d576f3d7edf32a1baf654da3d3e1ea0a6be2c8
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.187118 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" podUID="cb484617-f6e4-4412-868f-ead2b91c20ec" containerName="controller-manager" containerID="cri-o://48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b" gracePeriod=30
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.187493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx" event={"ID":"844e791c-e30c-4548-950d-9d5f84fd8005","Type":"ContainerStarted","Data":"c1f9b90a5739a2cb3f7007f157a384402974f899eb7ffa3a63ebd504120e76fb"}
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.187521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx" event={"ID":"844e791c-e30c-4548-950d-9d5f84fd8005","Type":"ContainerStarted","Data":"88c0e8389a28b077c4d0c329d2d576f3d7edf32a1baf654da3d3e1ea0a6be2c8"}
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.189016 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.702515 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.725956 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx" podStartSLOduration=1.725939918 podStartE2EDuration="1.725939918s" podCreationTimestamp="2025-10-02 18:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:30:30.22218878 +0000 UTC m=+751.409684679" watchObservedRunningTime="2025-10-02 18:30:30.725939918 +0000 UTC m=+751.913435777"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.737931 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vdkc\" (UniqueName: \"kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc\") pod \"cb484617-f6e4-4412-868f-ead2b91c20ec\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") "
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.737987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles\") pod \"cb484617-f6e4-4412-868f-ead2b91c20ec\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") "
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.738008 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert\") pod \"cb484617-f6e4-4412-868f-ead2b91c20ec\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") "
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.738054 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config\") pod \"cb484617-f6e4-4412-868f-ead2b91c20ec\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") "
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.738110 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca\") pod \"cb484617-f6e4-4412-868f-ead2b91c20ec\" (UID: \"cb484617-f6e4-4412-868f-ead2b91c20ec\") "
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.739238 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb484617-f6e4-4412-868f-ead2b91c20ec" (UID: "cb484617-f6e4-4412-868f-ead2b91c20ec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.739320 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb484617-f6e4-4412-868f-ead2b91c20ec" (UID: "cb484617-f6e4-4412-868f-ead2b91c20ec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.739996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config" (OuterVolumeSpecName: "config") pod "cb484617-f6e4-4412-868f-ead2b91c20ec" (UID: "cb484617-f6e4-4412-868f-ead2b91c20ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.743092 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69ccd9659f-6mk8b"]
Oct 02 18:30:30 crc kubenswrapper[4909]: E1002 18:30:30.743692 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb484617-f6e4-4412-868f-ead2b91c20ec" containerName="controller-manager"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.743775 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb484617-f6e4-4412-868f-ead2b91c20ec" containerName="controller-manager"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.743954 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb484617-f6e4-4412-868f-ead2b91c20ec" containerName="controller-manager"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.744588 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b"
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.771263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc" (OuterVolumeSpecName: "kube-api-access-6vdkc") pod "cb484617-f6e4-4412-868f-ead2b91c20ec" (UID: "cb484617-f6e4-4412-868f-ead2b91c20ec"). InnerVolumeSpecName "kube-api-access-6vdkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.774124 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb484617-f6e4-4412-868f-ead2b91c20ec" (UID: "cb484617-f6e4-4412-868f-ead2b91c20ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.825532 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69ccd9659f-6mk8b"]
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.843220 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vdkc\" (UniqueName: \"kubernetes.io/projected/cb484617-f6e4-4412-868f-ead2b91c20ec-kube-api-access-6vdkc\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.843265 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.843275 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb484617-f6e4-4412-868f-ead2b91c20ec-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 
02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.843288 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.843299 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb484617-f6e4-4412-868f-ead2b91c20ec-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.879596 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8644b89d4f-7rkcx" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.947743 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-proxy-ca-bundles\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.947835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-config\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.947876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvck4\" (UniqueName: \"kubernetes.io/projected/04511814-58c2-4388-8d18-7f8db99af677-kube-api-access-bvck4\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " 
pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.947904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-client-ca\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:30 crc kubenswrapper[4909]: I1002 18:30:30.947929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04511814-58c2-4388-8d18-7f8db99af677-serving-cert\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.049114 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvck4\" (UniqueName: \"kubernetes.io/projected/04511814-58c2-4388-8d18-7f8db99af677-kube-api-access-bvck4\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.049710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-client-ca\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.049741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04511814-58c2-4388-8d18-7f8db99af677-serving-cert\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.049812 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-proxy-ca-bundles\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.049889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-config\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.050804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-client-ca\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.051338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-proxy-ca-bundles\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.051870 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04511814-58c2-4388-8d18-7f8db99af677-config\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.055072 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-fqrjj"] Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.055426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04511814-58c2-4388-8d18-7f8db99af677-serving-cert\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.056006 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" Oct 02 18:30:31 crc kubenswrapper[4909]: W1002 18:30:31.058473 4909 reflector.go:561] object-"openshift-logging"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-logging": no relationship found between node 'crc' and this object Oct 02 18:30:31 crc kubenswrapper[4909]: E1002 18:30:31.058534 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-logging\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-logging\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 18:30:31 crc kubenswrapper[4909]: W1002 18:30:31.059231 4909 reflector.go:561] object-"openshift-logging"/"cluster-logging-operator-dockercfg-wwz5d": failed to list *v1.Secret: secrets "cluster-logging-operator-dockercfg-wwz5d" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-logging": no relationship found between node 'crc' and this object Oct 02 18:30:31 crc kubenswrapper[4909]: E1002 18:30:31.059274 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-logging\"/\"cluster-logging-operator-dockercfg-wwz5d\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-logging-operator-dockercfg-wwz5d\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-logging\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 18:30:31 crc kubenswrapper[4909]: W1002 18:30:31.059317 4909 reflector.go:561] object-"openshift-logging"/"openshift-service-ca.crt": failed 
to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-logging": no relationship found between node 'crc' and this object Oct 02 18:30:31 crc kubenswrapper[4909]: E1002 18:30:31.059331 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-logging\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-logging\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.069937 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-fqrjj"] Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.073831 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvck4\" (UniqueName: \"kubernetes.io/projected/04511814-58c2-4388-8d18-7f8db99af677-kube-api-access-bvck4\") pod \"controller-manager-69ccd9659f-6mk8b\" (UID: \"04511814-58c2-4388-8d18-7f8db99af677\") " pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.109039 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.196824 4909 generic.go:334] "Generic (PLEG): container finished" podID="cb484617-f6e4-4412-868f-ead2b91c20ec" containerID="48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b" exitCode=0 Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.196912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" event={"ID":"cb484617-f6e4-4412-868f-ead2b91c20ec","Type":"ContainerDied","Data":"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b"} Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.196954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" event={"ID":"cb484617-f6e4-4412-868f-ead2b91c20ec","Type":"ContainerDied","Data":"384a2dffea13ca1e6327982d243f253a92dcd0dc544b0dd40a3f8fda6b853355"} Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.196973 4909 scope.go:117] "RemoveContainer" containerID="48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.196923 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7797bcc454-hfj79" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.248756 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"] Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.252687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8lm\" (UniqueName: \"kubernetes.io/projected/63728faa-74d7-4f8c-baab-348f0a26da5b-kube-api-access-xz8lm\") pod \"cluster-logging-operator-8958c8b87-fqrjj\" (UID: \"63728faa-74d7-4f8c-baab-348f0a26da5b\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.252705 4909 scope.go:117] "RemoveContainer" containerID="48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b" Oct 02 18:30:31 crc kubenswrapper[4909]: E1002 18:30:31.253168 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b\": container with ID starting with 48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b not found: ID does not exist" containerID="48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.253215 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b"} err="failed to get container status \"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b\": rpc error: code = NotFound desc = could not find container \"48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b\": container with ID starting with 48ad2d59b90a089cbc7fbecbf4ed52b0f25881715d462386b5040c4bc70f248b not found: ID does not exist" Oct 02 18:30:31 crc 
kubenswrapper[4909]: I1002 18:30:31.259055 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7797bcc454-hfj79"] Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.353987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8lm\" (UniqueName: \"kubernetes.io/projected/63728faa-74d7-4f8c-baab-348f0a26da5b-kube-api-access-xz8lm\") pod \"cluster-logging-operator-8958c8b87-fqrjj\" (UID: \"63728faa-74d7-4f8c-baab-348f0a26da5b\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.537738 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69ccd9659f-6mk8b"] Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.622131 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb484617-f6e4-4412-868f-ead2b91c20ec" path="/var/lib/kubelet/pods/cb484617-f6e4-4412-868f-ead2b91c20ec/volumes" Oct 02 18:30:31 crc kubenswrapper[4909]: I1002 18:30:31.965075 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.108925 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-wwz5d" Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.214767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" event={"ID":"04511814-58c2-4388-8d18-7f8db99af677","Type":"ContainerStarted","Data":"ea98f56965acce2abf1f80708e9171c31afd3a367356f60819654a6673c92eb4"} Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.215773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" 
event={"ID":"04511814-58c2-4388-8d18-7f8db99af677","Type":"ContainerStarted","Data":"feed756c03831eaa708b6b626dce1a3efa2295360297344bac850abe0e0796ef"} Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.252687 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" podStartSLOduration=3.252661292 podStartE2EDuration="3.252661292s" podCreationTimestamp="2025-10-02 18:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:30:32.239957571 +0000 UTC m=+753.427453440" watchObservedRunningTime="2025-10-02 18:30:32.252661292 +0000 UTC m=+753.440157151" Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.407799 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.421682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8lm\" (UniqueName: \"kubernetes.io/projected/63728faa-74d7-4f8c-baab-348f0a26da5b-kube-api-access-xz8lm\") pod \"cluster-logging-operator-8958c8b87-fqrjj\" (UID: \"63728faa-74d7-4f8c-baab-348f0a26da5b\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" Oct 02 18:30:32 crc kubenswrapper[4909]: I1002 18:30:32.569365 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" Oct 02 18:30:33 crc kubenswrapper[4909]: I1002 18:30:33.224877 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:33 crc kubenswrapper[4909]: I1002 18:30:33.229974 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69ccd9659f-6mk8b" Oct 02 18:30:35 crc kubenswrapper[4909]: I1002 18:30:35.153578 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-fqrjj"] Oct 02 18:30:35 crc kubenswrapper[4909]: W1002 18:30:35.169159 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63728faa_74d7_4f8c_baab_348f0a26da5b.slice/crio-cb90d63116c2254e1a9ece60f3ea5a6286f03b39d330572e34aa822d63ff5a9a WatchSource:0}: Error finding container cb90d63116c2254e1a9ece60f3ea5a6286f03b39d330572e34aa822d63ff5a9a: Status 404 returned error can't find the container with id cb90d63116c2254e1a9ece60f3ea5a6286f03b39d330572e34aa822d63ff5a9a Oct 02 18:30:35 crc kubenswrapper[4909]: I1002 18:30:35.240707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" event={"ID":"63728faa-74d7-4f8c-baab-348f0a26da5b","Type":"ContainerStarted","Data":"cb90d63116c2254e1a9ece60f3ea5a6286f03b39d330572e34aa822d63ff5a9a"} Oct 02 18:30:35 crc kubenswrapper[4909]: I1002 18:30:35.243593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" event={"ID":"32aac12f-cd5f-4a59-8b82-051057ed0e70","Type":"ContainerStarted","Data":"8d497835dfa4edc532ede707298d21814d1833d4dbfec92f987a3d982ec5d2bb"} Oct 02 18:30:36 crc kubenswrapper[4909]: I1002 18:30:36.905719 4909 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.331817 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" event={"ID":"63728faa-74d7-4f8c-baab-348f0a26da5b","Type":"ContainerStarted","Data":"de75d010d6b848790a6106847c4d5796a9f8e82bbf924de7e159a13b0869679e"} Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.335789 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" event={"ID":"32aac12f-cd5f-4a59-8b82-051057ed0e70","Type":"ContainerStarted","Data":"e51be25e124613e40fc0384c27656a8c938fe1a8f900314f713d6ec10d7704fb"} Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.336067 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.341597 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.353696 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-8958c8b87-fqrjj" podStartSLOduration=4.900583329 podStartE2EDuration="14.3536811s" podCreationTimestamp="2025-10-02 18:30:31 +0000 UTC" firstStartedPulling="2025-10-02 18:30:35.172068345 +0000 UTC m=+756.359564204" lastFinishedPulling="2025-10-02 18:30:44.625166076 +0000 UTC m=+765.812661975" observedRunningTime="2025-10-02 18:30:45.349164877 +0000 UTC m=+766.536660746" watchObservedRunningTime="2025-10-02 18:30:45.3536811 +0000 UTC m=+766.541176959" Oct 02 18:30:45 crc kubenswrapper[4909]: I1002 18:30:45.387394 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-576dc5b57d-njbh7" podStartSLOduration=2.33299747 podStartE2EDuration="18.387373371s" podCreationTimestamp="2025-10-02 18:30:27 +0000 UTC" firstStartedPulling="2025-10-02 18:30:28.589088532 +0000 UTC m=+749.776584391" lastFinishedPulling="2025-10-02 18:30:44.643464413 +0000 UTC m=+765.830960292" observedRunningTime="2025-10-02 18:30:45.380696401 +0000 UTC m=+766.568192260" watchObservedRunningTime="2025-10-02 18:30:45.387373371 +0000 UTC m=+766.574869230" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.295294 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.298788 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.302512 4909 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-2xfd5" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.302990 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.303857 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.311090 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.448289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b0058b6-973b-494f-9724-9bb1760fd909\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b0058b6-973b-494f-9724-9bb1760fd909\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.448396 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d7df\" (UniqueName: \"kubernetes.io/projected/04d94bb5-5d24-4b89-8ebf-8b569057c57a-kube-api-access-9d7df\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.550083 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b0058b6-973b-494f-9724-9bb1760fd909\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b0058b6-973b-494f-9724-9bb1760fd909\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.550145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d7df\" (UniqueName: \"kubernetes.io/projected/04d94bb5-5d24-4b89-8ebf-8b569057c57a-kube-api-access-9d7df\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.554854 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.554927 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b0058b6-973b-494f-9724-9bb1760fd909\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b0058b6-973b-494f-9724-9bb1760fd909\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6b8136a29e61034237e85f6e48566c17860825ab92fb7addfe8025ab049a818/globalmount\"" pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.574679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d7df\" (UniqueName: \"kubernetes.io/projected/04d94bb5-5d24-4b89-8ebf-8b569057c57a-kube-api-access-9d7df\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.584977 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b0058b6-973b-494f-9724-9bb1760fd909\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b0058b6-973b-494f-9724-9bb1760fd909\") pod \"minio\" (UID: \"04d94bb5-5d24-4b89-8ebf-8b569057c57a\") " pod="minio-dev/minio" Oct 02 18:30:49 crc kubenswrapper[4909]: I1002 18:30:49.624633 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Oct 02 18:30:50 crc kubenswrapper[4909]: I1002 18:30:50.091460 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 02 18:30:50 crc kubenswrapper[4909]: I1002 18:30:50.371179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"04d94bb5-5d24-4b89-8ebf-8b569057c57a","Type":"ContainerStarted","Data":"418106852f03026c4cfb1f7e8cdd91ded374d9755803b8a1b9698e70067f313f"} Oct 02 18:30:53 crc kubenswrapper[4909]: I1002 18:30:53.055283 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:30:53 crc kubenswrapper[4909]: I1002 18:30:53.055813 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:30:56 crc kubenswrapper[4909]: I1002 18:30:56.420918 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"04d94bb5-5d24-4b89-8ebf-8b569057c57a","Type":"ContainerStarted","Data":"6b04ece738fc3a964fdbb551a99b36b41c2c851d803604dc1ac847fc20fa4aab"} Oct 02 18:30:56 crc kubenswrapper[4909]: I1002 18:30:56.433563 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.30307101 podStartE2EDuration="10.433534098s" podCreationTimestamp="2025-10-02 18:30:46 +0000 UTC" firstStartedPulling="2025-10-02 18:30:50.105921005 +0000 UTC m=+771.293416894" lastFinishedPulling="2025-10-02 18:30:55.236384113 +0000 UTC m=+776.423879982" observedRunningTime="2025-10-02 
18:30:56.432847976 +0000 UTC m=+777.620343835" watchObservedRunningTime="2025-10-02 18:30:56.433534098 +0000 UTC m=+777.621029977" Oct 02 18:30:59 crc kubenswrapper[4909]: I1002 18:30:59.997109 4909 scope.go:117] "RemoveContainer" containerID="83d170020d3b48b19b95904733fb705249c724f2dee9f6299d72e30872769683" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.065259 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-897dd"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.066305 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.072939 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.073582 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.073861 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-6vw6t" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.073984 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.074224 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.142118 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-897dd"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.228787 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-x8snb"] Oct 02 18:31:00 crc kubenswrapper[4909]: 
I1002 18:31:00.230016 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.230673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.230723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsgzb\" (UniqueName: \"kubernetes.io/projected/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-kube-api-access-jsgzb\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.230752 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-config\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.230776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc 
kubenswrapper[4909]: I1002 18:31:00.230822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.232742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.232943 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.233475 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.243644 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-x8snb"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.286270 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.287271 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.294286 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.297014 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.310224 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332239 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332334 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 
18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332419 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332451 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332489 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-config\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332543 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332583 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbbxb\" (UniqueName: \"kubernetes.io/projected/8203105d-afce-403b-8a70-d624672e2826-kube-api-access-jbbxb\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsgzb\" (UniqueName: \"kubernetes.io/projected/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-kube-api-access-jsgzb\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.332641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-config\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.333962 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-config\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 
18:31:00.335117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.341447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.350833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.353437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsgzb\" (UniqueName: \"kubernetes.io/projected/68d54b28-bf98-4e97-a5f8-9cc1abb31a5d-kube-api-access-jsgzb\") pod \"logging-loki-distributor-6f5f7fff97-897dd\" (UID: \"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.430836 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-htpg9"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.432077 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.433695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.433822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ph2p\" (UniqueName: \"kubernetes.io/projected/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-kube-api-access-4ph2p\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.433915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.433998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434096 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434236 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-config\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbbxb\" (UniqueName: \"kubernetes.io/projected/8203105d-afce-403b-8a70-d624672e2826-kube-api-access-jbbxb\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434417 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: 
\"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.434562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.437547 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.437788 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.437952 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.438131 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.438552 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: 
\"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.439309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.440190 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8203105d-afce-403b-8a70-d624672e2826-config\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.442537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.445329 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.447895 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8203105d-afce-403b-8a70-d624672e2826-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 
18:31:00.451203 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.472900 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-htpg9"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.507081 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-5dfzs"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.508880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbbxb\" (UniqueName: \"kubernetes.io/projected/8203105d-afce-403b-8a70-d624672e2826-kube-api-access-jbbxb\") pod \"logging-loki-querier-5d954896cf-x8snb\" (UID: \"8203105d-afce-403b-8a70-d624672e2826\") " pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.519464 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.522011 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-dkqbc" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.522383 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-5dfzs"] Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.536691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tenants\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjszs\" (UniqueName: 
\"kubernetes.io/projected/42692035-de07-49cc-b2f6-2305a4ff6f31-kube-api-access-wjszs\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537607 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537733 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.537851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " 
pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538184 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ph2p\" (UniqueName: \"kubernetes.io/projected/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-kube-api-access-4ph2p\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-rbac\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538464 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538579 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.538714 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.540125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.540768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.546850 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.547262 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " 
pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.547565 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.565995 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ph2p\" (UniqueName: \"kubernetes.io/projected/1ddf8d24-c907-43af-bb68-ee9a2c28fd67-kube-api-access-4ph2p\") pod \"logging-loki-query-frontend-6fbbbc8b7d-pp4hn\" (UID: \"1ddf8d24-c907-43af-bb68-ee9a2c28fd67\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.609549 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tenants\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-rbac\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640470 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640501 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-rbac\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640574 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 
18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tenants\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjszs\" (UniqueName: \"kubernetes.io/projected/42692035-de07-49cc-b2f6-2305a4ff6f31-kube-api-access-wjszs\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640692 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640726 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwssr\" (UniqueName: \"kubernetes.io/projected/1714d70d-b81b-4886-816f-da1588c7364a-kube-api-access-bwssr\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640826 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tls-secret\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.640866 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 
18:31:00 crc kubenswrapper[4909]: E1002 18:31:00.641580 4909 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Oct 02 18:31:00 crc kubenswrapper[4909]: E1002 18:31:00.641675 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret podName:42692035-de07-49cc-b2f6-2305a4ff6f31 nodeName:}" failed. No retries permitted until 2025-10-02 18:31:01.141647951 +0000 UTC m=+782.329143860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret") pod "logging-loki-gateway-5994fd858f-htpg9" (UID: "42692035-de07-49cc-b2f6-2305a4ff6f31") : secret "logging-loki-gateway-http" not found Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.641978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.643220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.643308 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-ca-bundle\") pod 
\"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.646798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.648316 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tenants\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.652451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/42692035-de07-49cc-b2f6-2305a4ff6f31-rbac\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.666936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjszs\" (UniqueName: \"kubernetes.io/projected/42692035-de07-49cc-b2f6-2305a4ff6f31-kube-api-access-wjszs\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742444 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tenants\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-rbac\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742526 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742571 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742619 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwssr\" (UniqueName: \"kubernetes.io/projected/1714d70d-b81b-4886-816f-da1588c7364a-kube-api-access-bwssr\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742672 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tls-secret\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.742689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.743656 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-lokistack-gateway\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.743764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") 
" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.744087 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-rbac\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.744121 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.748055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tenants\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.748294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.748760 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1714d70d-b81b-4886-816f-da1588c7364a-tls-secret\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: 
\"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.761402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwssr\" (UniqueName: \"kubernetes.io/projected/1714d70d-b81b-4886-816f-da1588c7364a-kube-api-access-bwssr\") pod \"logging-loki-gateway-5994fd858f-5dfzs\" (UID: \"1714d70d-b81b-4886-816f-da1588c7364a\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.854956 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:00 crc kubenswrapper[4909]: I1002 18:31:00.880863 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-897dd"] Oct 02 18:31:00 crc kubenswrapper[4909]: W1002 18:31:00.899464 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68d54b28_bf98_4e97_a5f8_9cc1abb31a5d.slice/crio-56c984144433173a071ffc3474f3cdbf85c32d693a38df431160105865fd83d2 WatchSource:0}: Error finding container 56c984144433173a071ffc3474f3cdbf85c32d693a38df431160105865fd83d2: Status 404 returned error can't find the container with id 56c984144433173a071ffc3474f3cdbf85c32d693a38df431160105865fd83d2 Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.111892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-x8snb"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.114318 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-5dfzs"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.152169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: 
\"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.158625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/42692035-de07-49cc-b2f6-2305a4ff6f31-tls-secret\") pod \"logging-loki-gateway-5994fd858f-htpg9\" (UID: \"42692035-de07-49cc-b2f6-2305a4ff6f31\") " pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.172611 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn"] Oct 02 18:31:01 crc kubenswrapper[4909]: W1002 18:31:01.181158 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ddf8d24_c907_43af_bb68_ee9a2c28fd67.slice/crio-40acaafaaf650b9e94c047cf6d6998176dbb711dcaa26ab8d82e789e38169efb WatchSource:0}: Error finding container 40acaafaaf650b9e94c047cf6d6998176dbb711dcaa26ab8d82e789e38169efb: Status 404 returned error can't find the container with id 40acaafaaf650b9e94c047cf6d6998176dbb711dcaa26ab8d82e789e38169efb Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.233457 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.234509 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.241491 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.241655 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.248069 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.290560 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.291737 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.294079 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.294268 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.303206 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.354400 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.354651 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmpll\" (UniqueName: \"kubernetes.io/projected/3af72dc6-9572-44cd-b4b8-cab3a6857f08-kube-api-access-gmpll\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.354759 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.354892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.355018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.355143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: 
\"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.355242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.355347 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-config\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.389691 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.391000 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.393291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.393375 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.396381 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.438692 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.456853 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.456913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.456950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmpll\" (UniqueName: \"kubernetes.io/projected/3af72dc6-9572-44cd-b4b8-cab3a6857f08-kube-api-access-gmpll\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.456979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457082 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457201 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457229 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4kt6\" (UniqueName: \"kubernetes.io/projected/90528ba7-f037-4809-959c-26c1a511bb84-kube-api-access-s4kt6\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-config\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457289 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-config\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.457352 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.458418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.459083 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af72dc6-9572-44cd-b4b8-cab3a6857f08-config\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.461220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.464137 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.465098 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3af72dc6-9572-44cd-b4b8-cab3a6857f08-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.466315 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.466347 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1eb8cf328e931942d571014c872b0901287fd70e3440f19352c5539d0b08107a/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.473681 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.473722 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1374e3058d6b0e420c7a9e6cc0e26d34e2ee3f50fd5ee7ecd64c297d82300d75/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.484329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmpll\" (UniqueName: \"kubernetes.io/projected/3af72dc6-9572-44cd-b4b8-cab3a6857f08-kube-api-access-gmpll\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.494008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" event={"ID":"8203105d-afce-403b-8a70-d624672e2826","Type":"ContainerStarted","Data":"cfa0cba1cb263c1de2ad1fa27beba4d4f8d8c07fe46a978aad31e0e8df8bdbec"} Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.498937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc8c007d-0619-461e-8c6d-e0f8e5650385\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.500424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" 
event={"ID":"1714d70d-b81b-4886-816f-da1588c7364a","Type":"ContainerStarted","Data":"84e7a257382496859813a65163ba90d1d079e9a77d3ef0ec39e797ef8e23d907"} Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.502495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" event={"ID":"1ddf8d24-c907-43af-bb68-ee9a2c28fd67","Type":"ContainerStarted","Data":"40acaafaaf650b9e94c047cf6d6998176dbb711dcaa26ab8d82e789e38169efb"} Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.507097 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" event={"ID":"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d","Type":"ContainerStarted","Data":"56c984144433173a071ffc3474f3cdbf85c32d693a38df431160105865fd83d2"} Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.509549 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03bf4d7-2bef-4d27-8353-6258bcbd18f8\") pod \"logging-loki-ingester-0\" (UID: \"3af72dc6-9572-44cd-b4b8-cab3a6857f08\") " pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.549999 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560184 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-config\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560236 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fmt\" (UniqueName: \"kubernetes.io/projected/f384831b-fa88-451d-9e4c-3181bd39ed42-kube-api-access-d7fmt\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-grpc\") pod 
\"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560617 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560658 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560786 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-config\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560827 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.560973 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4kt6\" (UniqueName: \"kubernetes.io/projected/90528ba7-f037-4809-959c-26c1a511bb84-kube-api-access-s4kt6\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.561045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.562858 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-config\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.565428 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.565455 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f1c2bc5133af5f0d7be6f5d44d77acf8fde886b1b1c6b3036cc1454715f6970/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.566530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.569733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.578688 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4kt6\" (UniqueName: \"kubernetes.io/projected/90528ba7-f037-4809-959c-26c1a511bb84-kube-api-access-s4kt6\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.579179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.579750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/90528ba7-f037-4809-959c-26c1a511bb84-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.587170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bc45c91-2fc0-4b12-93d6-f614a3e2cb49\") pod \"logging-loki-compactor-0\" (UID: \"90528ba7-f037-4809-959c-26c1a511bb84\") " pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.603787 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662194 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-config\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662400 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-s3\") pod 
\"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fmt\" (UniqueName: \"kubernetes.io/projected/f384831b-fa88-451d-9e4c-3181bd39ed42-kube-api-access-d7fmt\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.662460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.663561 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.663729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f384831b-fa88-451d-9e4c-3181bd39ed42-config\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.667768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.667879 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.668610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f384831b-fa88-451d-9e4c-3181bd39ed42-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.669937 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.669972 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f00ccf1b867a6bcf0d27eacce8dd240b8edbb936a0f0e62840e9e80839a5b362/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.682527 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fmt\" (UniqueName: \"kubernetes.io/projected/f384831b-fa88-451d-9e4c-3181bd39ed42-kube-api-access-d7fmt\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.703336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa49551-3a16-4260-ae77-2fd53fbbb1f0\") pod \"logging-loki-index-gateway-0\" (UID: \"f384831b-fa88-451d-9e4c-3181bd39ed42\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.713953 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:01 crc kubenswrapper[4909]: I1002 18:31:01.847331 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5994fd858f-htpg9"] Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.013551 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.102270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.203346 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.514839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"90528ba7-f037-4809-959c-26c1a511bb84","Type":"ContainerStarted","Data":"053fee54e2d236378c20690f23785fa9a19284ee4866294c4003004bc1d3aea5"} Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.515920 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"f384831b-fa88-451d-9e4c-3181bd39ed42","Type":"ContainerStarted","Data":"cefeddbf449da758f32de5cdcd6894b1190b2bf14a020cfbc022ee7e680831a8"} Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.517143 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" event={"ID":"42692035-de07-49cc-b2f6-2305a4ff6f31","Type":"ContainerStarted","Data":"b82bf8d6d4d2337d733e9dd3982178006bb5031cbafafde0655e48ae7dbac09d"} Oct 02 18:31:02 crc kubenswrapper[4909]: I1002 18:31:02.518474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" 
event={"ID":"3af72dc6-9572-44cd-b4b8-cab3a6857f08","Type":"ContainerStarted","Data":"004d594f4cdfc537dee51b385efe1e58645d888a5d0e2e9bb67ffe1a30534c68"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.563417 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" event={"ID":"42692035-de07-49cc-b2f6-2305a4ff6f31","Type":"ContainerStarted","Data":"4526e0a6e6241202f46c56c5424cca0214c94caeef605a01c8686bf6cec50c0a"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.565853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" event={"ID":"8203105d-afce-403b-8a70-d624672e2826","Type":"ContainerStarted","Data":"fc6a13e2b1491ce3b6cb6bd67ceabf06ad4523bcdc23bb74b6b7c0bacb97e7f9"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.565942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.568700 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"3af72dc6-9572-44cd-b4b8-cab3a6857f08","Type":"ContainerStarted","Data":"ab57e814215bb45ec58192df707da261e6f9aa851f3fde8aa07f2149bc61841b"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.568879 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.571906 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" event={"ID":"1714d70d-b81b-4886-816f-da1588c7364a","Type":"ContainerStarted","Data":"e396778cc8a9bb9db04c595eab2d228d7305ee40cf9f352c75dadb36d1f3f891"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.574104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" event={"ID":"1ddf8d24-c907-43af-bb68-ee9a2c28fd67","Type":"ContainerStarted","Data":"e3cbb61300bd78d321d1afe196ee4679c57c2d91c6c20b5d15ebba21e912def0"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.574269 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.575805 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" event={"ID":"68d54b28-bf98-4e97-a5f8-9cc1abb31a5d","Type":"ContainerStarted","Data":"fd215f8c16d728dba31fa517dfd27be074b364b645e109d048f2836db1e631e1"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.576005 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.577998 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"90528ba7-f037-4809-959c-26c1a511bb84","Type":"ContainerStarted","Data":"7721fcc260fee2b455a40de29520e56cfc47c9941d6ab5705bacbe70131863d1"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.578296 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.580586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"f384831b-fa88-451d-9e4c-3181bd39ed42","Type":"ContainerStarted","Data":"93b26c9147a1c641f83fdafc4e0d0ebd246b3d8790d71139aef222d58ca900d7"} Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.581097 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:07 crc 
kubenswrapper[4909]: I1002 18:31:07.599899 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" podStartSLOduration=2.436803467 podStartE2EDuration="7.599873042s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:01.1136669 +0000 UTC m=+782.301162759" lastFinishedPulling="2025-10-02 18:31:06.276736475 +0000 UTC m=+787.464232334" observedRunningTime="2025-10-02 18:31:07.597118585 +0000 UTC m=+788.784614454" watchObservedRunningTime="2025-10-02 18:31:07.599873042 +0000 UTC m=+788.787368911" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.629859 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.598839976 podStartE2EDuration="7.629823346s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:02.215207752 +0000 UTC m=+783.402703611" lastFinishedPulling="2025-10-02 18:31:06.246191082 +0000 UTC m=+787.433686981" observedRunningTime="2025-10-02 18:31:07.628405912 +0000 UTC m=+788.815901781" watchObservedRunningTime="2025-10-02 18:31:07.629823346 +0000 UTC m=+788.817319245" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.660897 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" podStartSLOduration=2.391007703 podStartE2EDuration="7.660871475s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:00.905068644 +0000 UTC m=+782.092564503" lastFinishedPulling="2025-10-02 18:31:06.174932396 +0000 UTC m=+787.362428275" observedRunningTime="2025-10-02 18:31:07.654107942 +0000 UTC m=+788.841603821" watchObservedRunningTime="2025-10-02 18:31:07.660871475 +0000 UTC m=+788.848367344" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.678948 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.509465528 podStartE2EDuration="7.678923204s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:02.016224129 +0000 UTC m=+783.203720018" lastFinishedPulling="2025-10-02 18:31:06.185681825 +0000 UTC m=+787.373177694" observedRunningTime="2025-10-02 18:31:07.673637097 +0000 UTC m=+788.861132966" watchObservedRunningTime="2025-10-02 18:31:07.678923204 +0000 UTC m=+788.866419103" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.699917 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" podStartSLOduration=2.695402696 podStartE2EDuration="7.699886524s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:01.183610264 +0000 UTC m=+782.371106123" lastFinishedPulling="2025-10-02 18:31:06.188094082 +0000 UTC m=+787.375589951" observedRunningTime="2025-10-02 18:31:07.692229493 +0000 UTC m=+788.879725352" watchObservedRunningTime="2025-10-02 18:31:07.699886524 +0000 UTC m=+788.887382383" Oct 02 18:31:07 crc kubenswrapper[4909]: I1002 18:31:07.716752 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.5451274230000003 podStartE2EDuration="7.716726646s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:02.114921401 +0000 UTC m=+783.302417260" lastFinishedPulling="2025-10-02 18:31:06.286520624 +0000 UTC m=+787.474016483" observedRunningTime="2025-10-02 18:31:07.711609654 +0000 UTC m=+788.899105513" watchObservedRunningTime="2025-10-02 18:31:07.716726646 +0000 UTC m=+788.904222515" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.597950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" 
event={"ID":"1714d70d-b81b-4886-816f-da1588c7364a","Type":"ContainerStarted","Data":"2259f5899b370be1744bb5b5c7a400b618e68becbb98bb53e2de6b16c8ebd6ab"} Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.600153 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.600192 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.602617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" event={"ID":"42692035-de07-49cc-b2f6-2305a4ff6f31","Type":"ContainerStarted","Data":"67e5e253e37e7a5325aed90ee1d0f3d830726076d87a5a87ea342645f44755e6"} Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.603560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.603965 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.622097 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.622188 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.622236 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.628371 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-logging/logging-loki-gateway-5994fd858f-5dfzs" podStartSLOduration=1.871584511 podStartE2EDuration="9.628348572s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:01.143378977 +0000 UTC m=+782.330874836" lastFinishedPulling="2025-10-02 18:31:08.900142998 +0000 UTC m=+790.087638897" observedRunningTime="2025-10-02 18:31:09.623107636 +0000 UTC m=+790.810603555" watchObservedRunningTime="2025-10-02 18:31:09.628348572 +0000 UTC m=+790.815844461" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.645929 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" Oct 02 18:31:09 crc kubenswrapper[4909]: I1002 18:31:09.659734 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5994fd858f-htpg9" podStartSLOduration=2.618926607 podStartE2EDuration="9.65970655s" podCreationTimestamp="2025-10-02 18:31:00 +0000 UTC" firstStartedPulling="2025-10-02 18:31:01.860279954 +0000 UTC m=+783.047775813" lastFinishedPulling="2025-10-02 18:31:08.901059897 +0000 UTC m=+790.088555756" observedRunningTime="2025-10-02 18:31:09.652633718 +0000 UTC m=+790.840129597" watchObservedRunningTime="2025-10-02 18:31:09.65970655 +0000 UTC m=+790.847202419" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.751937 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.753753 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.788381 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.801137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.801269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcp5\" (UniqueName: \"kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.801301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.902368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcp5\" (UniqueName: \"kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.902426 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.902486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.903112 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.903193 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:14 crc kubenswrapper[4909]: I1002 18:31:14.944313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcp5\" (UniqueName: \"kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5\") pod \"community-operators-wbxvh\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:15 crc kubenswrapper[4909]: I1002 18:31:15.085019 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:15 crc kubenswrapper[4909]: I1002 18:31:15.589854 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:15 crc kubenswrapper[4909]: I1002 18:31:15.651011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerStarted","Data":"106bed2ca0dd4ec9e6525cf8bebf41543b24968890cf730a3d02b027d1c10c2f"} Oct 02 18:31:16 crc kubenswrapper[4909]: I1002 18:31:16.659738 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7fb6493-046e-458a-9b18-1c3223449c01" containerID="28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874" exitCode=0 Oct 02 18:31:16 crc kubenswrapper[4909]: I1002 18:31:16.659845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerDied","Data":"28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874"} Oct 02 18:31:20 crc kubenswrapper[4909]: I1002 18:31:20.700771 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7fb6493-046e-458a-9b18-1c3223449c01" containerID="9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53" exitCode=0 Oct 02 18:31:20 crc kubenswrapper[4909]: I1002 18:31:20.700835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerDied","Data":"9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53"} Oct 02 18:31:21 crc kubenswrapper[4909]: I1002 18:31:21.561879 4909 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body=Ingester not ready: this instance owns no tokens Oct 02 18:31:21 crc kubenswrapper[4909]: I1002 18:31:21.562214 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="3af72dc6-9572-44cd-b4b8-cab3a6857f08" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 02 18:31:21 crc kubenswrapper[4909]: I1002 18:31:21.617883 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Oct 02 18:31:21 crc kubenswrapper[4909]: I1002 18:31:21.720671 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.054598 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.054958 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.055007 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.055829 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722"} 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.055896 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722" gracePeriod=600 Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.732911 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722" exitCode=0 Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.732980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722"} Oct 02 18:31:23 crc kubenswrapper[4909]: I1002 18:31:23.733074 4909 scope.go:117] "RemoveContainer" containerID="b4c495a863221a111f0536c59b76004ad7cbb71445dd30409ddb1deb091b0858" Oct 02 18:31:24 crc kubenswrapper[4909]: I1002 18:31:24.741635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5"} Oct 02 18:31:24 crc kubenswrapper[4909]: I1002 18:31:24.743982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerStarted","Data":"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa"} Oct 02 
18:31:24 crc kubenswrapper[4909]: I1002 18:31:24.811270 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbxvh" podStartSLOduration=4.188398573 podStartE2EDuration="10.811248072s" podCreationTimestamp="2025-10-02 18:31:14 +0000 UTC" firstStartedPulling="2025-10-02 18:31:16.662101743 +0000 UTC m=+797.849597612" lastFinishedPulling="2025-10-02 18:31:23.284951252 +0000 UTC m=+804.472447111" observedRunningTime="2025-10-02 18:31:24.806631457 +0000 UTC m=+805.994127356" watchObservedRunningTime="2025-10-02 18:31:24.811248072 +0000 UTC m=+805.998743931" Oct 02 18:31:25 crc kubenswrapper[4909]: I1002 18:31:25.085254 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:25 crc kubenswrapper[4909]: I1002 18:31:25.085465 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:26 crc kubenswrapper[4909]: I1002 18:31:26.140374 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wbxvh" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="registry-server" probeResult="failure" output=< Oct 02 18:31:26 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:31:26 crc kubenswrapper[4909]: > Oct 02 18:31:30 crc kubenswrapper[4909]: I1002 18:31:30.462185 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-897dd" Oct 02 18:31:30 crc kubenswrapper[4909]: I1002 18:31:30.559310 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5d954896cf-x8snb" Oct 02 18:31:30 crc kubenswrapper[4909]: I1002 18:31:30.617881 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-pp4hn" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.026667 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.028340 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.041692 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.178332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmvz\" (UniqueName: \"kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.178442 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.178492 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.280064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sgmvz\" (UniqueName: \"kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.280136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.280196 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.280805 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.281064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.304523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmvz\" (UniqueName: 
\"kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz\") pod \"certified-operators-lsnrp\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.387055 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.568632 4909 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.568719 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="3af72dc6-9572-44cd-b4b8-cab3a6857f08" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.672062 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:31 crc kubenswrapper[4909]: I1002 18:31:31.810705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsnrp" event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerStarted","Data":"55ceb04e18719015191ea4036f454a3919ed8d4795f7c8d22df60e82bb43dda9"} Oct 02 18:31:32 crc kubenswrapper[4909]: I1002 18:31:32.818503 4909 generic.go:334] "Generic (PLEG): container finished" podID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerID="f47de1baf77b927fad2f244ce25606dc2606259f0a8e35766cccbd12d48237c8" exitCode=0 Oct 02 18:31:32 crc kubenswrapper[4909]: I1002 18:31:32.818552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsnrp" 
event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerDied","Data":"f47de1baf77b927fad2f244ce25606dc2606259f0a8e35766cccbd12d48237c8"} Oct 02 18:31:33 crc kubenswrapper[4909]: I1002 18:31:33.828287 4909 generic.go:334] "Generic (PLEG): container finished" podID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerID="e344a38f7a1fe13081e1fcf00cc8b08c3b6bfe4d5faea34fd3698c3ffcadb66e" exitCode=0 Oct 02 18:31:33 crc kubenswrapper[4909]: I1002 18:31:33.828393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsnrp" event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerDied","Data":"e344a38f7a1fe13081e1fcf00cc8b08c3b6bfe4d5faea34fd3698c3ffcadb66e"} Oct 02 18:31:34 crc kubenswrapper[4909]: I1002 18:31:34.839203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsnrp" event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerStarted","Data":"24f2247dcc33bd68d49795482a7fef0908396acf3e884d249d5d18f85aa866f3"} Oct 02 18:31:34 crc kubenswrapper[4909]: I1002 18:31:34.867468 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lsnrp" podStartSLOduration=2.38005169 podStartE2EDuration="3.867437794s" podCreationTimestamp="2025-10-02 18:31:31 +0000 UTC" firstStartedPulling="2025-10-02 18:31:32.821292507 +0000 UTC m=+814.008788386" lastFinishedPulling="2025-10-02 18:31:34.308678591 +0000 UTC m=+815.496174490" observedRunningTime="2025-10-02 18:31:34.861989372 +0000 UTC m=+816.049485231" watchObservedRunningTime="2025-10-02 18:31:34.867437794 +0000 UTC m=+816.054933653" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.133181 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.186938 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.211933 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.214088 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.225093 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.348402 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.348845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wx7\" (UniqueName: \"kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.349079 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.451050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b5wx7\" (UniqueName: \"kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.451454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.451709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.452043 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.452213 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.480415 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wx7\" (UniqueName: 
\"kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7\") pod \"redhat-marketplace-kdqxq\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:35 crc kubenswrapper[4909]: I1002 18:31:35.538044 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:36 crc kubenswrapper[4909]: I1002 18:31:36.041900 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:36 crc kubenswrapper[4909]: I1002 18:31:36.856890 4909 generic.go:334] "Generic (PLEG): container finished" podID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerID="c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63" exitCode=0 Oct 02 18:31:36 crc kubenswrapper[4909]: I1002 18:31:36.856947 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerDied","Data":"c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63"} Oct 02 18:31:36 crc kubenswrapper[4909]: I1002 18:31:36.857347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerStarted","Data":"66718d0317c5379973f99640d75ff0ebb2010f8ffdf49d958389147e7223cc14"} Oct 02 18:31:37 crc kubenswrapper[4909]: I1002 18:31:37.603776 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:37 crc kubenswrapper[4909]: I1002 18:31:37.604166 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wbxvh" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="registry-server" 
containerID="cri-o://def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa" gracePeriod=2 Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.622860 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.809829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities\") pod \"c7fb6493-046e-458a-9b18-1c3223449c01\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.809912 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rcp5\" (UniqueName: \"kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5\") pod \"c7fb6493-046e-458a-9b18-1c3223449c01\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.810097 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content\") pod \"c7fb6493-046e-458a-9b18-1c3223449c01\" (UID: \"c7fb6493-046e-458a-9b18-1c3223449c01\") " Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.812013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities" (OuterVolumeSpecName: "utilities") pod "c7fb6493-046e-458a-9b18-1c3223449c01" (UID: "c7fb6493-046e-458a-9b18-1c3223449c01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.821501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5" (OuterVolumeSpecName: "kube-api-access-7rcp5") pod "c7fb6493-046e-458a-9b18-1c3223449c01" (UID: "c7fb6493-046e-458a-9b18-1c3223449c01"). InnerVolumeSpecName "kube-api-access-7rcp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.877653 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7fb6493-046e-458a-9b18-1c3223449c01" containerID="def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa" exitCode=0 Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.877729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerDied","Data":"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa"} Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.877760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbxvh" event={"ID":"c7fb6493-046e-458a-9b18-1c3223449c01","Type":"ContainerDied","Data":"106bed2ca0dd4ec9e6525cf8bebf41543b24968890cf730a3d02b027d1c10c2f"} Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.877762 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbxvh" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.877779 4909 scope.go:117] "RemoveContainer" containerID="def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.882160 4909 generic.go:334] "Generic (PLEG): container finished" podID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerID="d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf" exitCode=0 Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.882317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerDied","Data":"d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf"} Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.903612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7fb6493-046e-458a-9b18-1c3223449c01" (UID: "c7fb6493-046e-458a-9b18-1c3223449c01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.913356 4909 scope.go:117] "RemoveContainer" containerID="9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.916082 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.916111 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rcp5\" (UniqueName: \"kubernetes.io/projected/c7fb6493-046e-458a-9b18-1c3223449c01-kube-api-access-7rcp5\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.916125 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7fb6493-046e-458a-9b18-1c3223449c01-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.949557 4909 scope.go:117] "RemoveContainer" containerID="28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.976810 4909 scope.go:117] "RemoveContainer" containerID="def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa" Oct 02 18:31:38 crc kubenswrapper[4909]: E1002 18:31:38.977461 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa\": container with ID starting with def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa not found: ID does not exist" containerID="def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.977530 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa"} err="failed to get container status \"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa\": rpc error: code = NotFound desc = could not find container \"def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa\": container with ID starting with def2612d6cd13f34a6e7aef28bb9dda59f3f38ba9cbf1bbd015bac2232133daa not found: ID does not exist" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.977563 4909 scope.go:117] "RemoveContainer" containerID="9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53" Oct 02 18:31:38 crc kubenswrapper[4909]: E1002 18:31:38.978209 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53\": container with ID starting with 9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53 not found: ID does not exist" containerID="9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.978261 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53"} err="failed to get container status \"9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53\": rpc error: code = NotFound desc = could not find container \"9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53\": container with ID starting with 9b2dfb8fd35b4c57c58323f7192578d6fd68ff1e1759697b4c716bf280828b53 not found: ID does not exist" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.978295 4909 scope.go:117] "RemoveContainer" containerID="28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874" Oct 02 18:31:38 crc kubenswrapper[4909]: E1002 18:31:38.978818 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874\": container with ID starting with 28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874 not found: ID does not exist" containerID="28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874" Oct 02 18:31:38 crc kubenswrapper[4909]: I1002 18:31:38.978855 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874"} err="failed to get container status \"28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874\": rpc error: code = NotFound desc = could not find container \"28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874\": container with ID starting with 28db607fff6454f2aa510189834ddc771154ae8e7c7db6b44f580db9901a7874 not found: ID does not exist" Oct 02 18:31:39 crc kubenswrapper[4909]: I1002 18:31:39.237101 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:39 crc kubenswrapper[4909]: I1002 18:31:39.245824 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbxvh"] Oct 02 18:31:39 crc kubenswrapper[4909]: I1002 18:31:39.621556 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" path="/var/lib/kubelet/pods/c7fb6493-046e-458a-9b18-1c3223449c01/volumes" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.389525 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.389962 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.458757 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.559218 4909 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.559310 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="3af72dc6-9572-44cd-b4b8-cab3a6857f08" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.910776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerStarted","Data":"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a"} Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.931880 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kdqxq" podStartSLOduration=2.695908351 podStartE2EDuration="6.931859833s" podCreationTimestamp="2025-10-02 18:31:35 +0000 UTC" firstStartedPulling="2025-10-02 18:31:36.858698281 +0000 UTC m=+818.046194140" lastFinishedPulling="2025-10-02 18:31:41.094649723 +0000 UTC m=+822.282145622" observedRunningTime="2025-10-02 18:31:41.92830376 +0000 UTC m=+823.115799629" watchObservedRunningTime="2025-10-02 18:31:41.931859833 +0000 UTC m=+823.119355702" Oct 02 18:31:41 crc kubenswrapper[4909]: I1002 18:31:41.967109 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:43 crc kubenswrapper[4909]: I1002 18:31:43.638916 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:43 crc kubenswrapper[4909]: I1002 18:31:43.928892 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lsnrp" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="registry-server" containerID="cri-o://24f2247dcc33bd68d49795482a7fef0908396acf3e884d249d5d18f85aa866f3" gracePeriod=2 Oct 02 18:31:44 crc kubenswrapper[4909]: I1002 18:31:44.940552 4909 generic.go:334] "Generic (PLEG): container finished" podID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerID="24f2247dcc33bd68d49795482a7fef0908396acf3e884d249d5d18f85aa866f3" exitCode=0 Oct 02 18:31:44 crc kubenswrapper[4909]: I1002 18:31:44.940680 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsnrp" event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerDied","Data":"24f2247dcc33bd68d49795482a7fef0908396acf3e884d249d5d18f85aa866f3"} Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.079664 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.232219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities\") pod \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.232415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmvz\" (UniqueName: \"kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz\") pod \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.232484 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content\") pod \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\" (UID: \"9c010275-8a6f-4c89-a1f9-a6c65cb63253\") " Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.234131 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities" (OuterVolumeSpecName: "utilities") pod "9c010275-8a6f-4c89-a1f9-a6c65cb63253" (UID: "9c010275-8a6f-4c89-a1f9-a6c65cb63253"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.242971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz" (OuterVolumeSpecName: "kube-api-access-sgmvz") pod "9c010275-8a6f-4c89-a1f9-a6c65cb63253" (UID: "9c010275-8a6f-4c89-a1f9-a6c65cb63253"). InnerVolumeSpecName "kube-api-access-sgmvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.335504 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.335621 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgmvz\" (UniqueName: \"kubernetes.io/projected/9c010275-8a6f-4c89-a1f9-a6c65cb63253-kube-api-access-sgmvz\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.506591 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c010275-8a6f-4c89-a1f9-a6c65cb63253" (UID: "9c010275-8a6f-4c89-a1f9-a6c65cb63253"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.538615 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.539324 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.540180 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c010275-8a6f-4c89-a1f9-a6c65cb63253-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.622809 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.952157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-lsnrp" event={"ID":"9c010275-8a6f-4c89-a1f9-a6c65cb63253","Type":"ContainerDied","Data":"55ceb04e18719015191ea4036f454a3919ed8d4795f7c8d22df60e82bb43dda9"} Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.952709 4909 scope.go:117] "RemoveContainer" containerID="24f2247dcc33bd68d49795482a7fef0908396acf3e884d249d5d18f85aa866f3" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.952203 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsnrp" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.977878 4909 scope.go:117] "RemoveContainer" containerID="e344a38f7a1fe13081e1fcf00cc8b08c3b6bfe4d5faea34fd3698c3ffcadb66e" Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.987323 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:45 crc kubenswrapper[4909]: I1002 18:31:45.997714 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lsnrp"] Oct 02 18:31:46 crc kubenswrapper[4909]: I1002 18:31:46.002384 4909 scope.go:117] "RemoveContainer" containerID="f47de1baf77b927fad2f244ce25606dc2606259f0a8e35766cccbd12d48237c8" Oct 02 18:31:47 crc kubenswrapper[4909]: I1002 18:31:47.624770 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" path="/var/lib/kubelet/pods/9c010275-8a6f-4c89-a1f9-a6c65cb63253/volumes" Oct 02 18:31:51 crc kubenswrapper[4909]: I1002 18:31:51.558091 4909 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Oct 02 18:31:51 crc kubenswrapper[4909]: I1002 18:31:51.558466 4909 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-ingester-0" podUID="3af72dc6-9572-44cd-b4b8-cab3a6857f08" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 02 18:31:55 crc kubenswrapper[4909]: I1002 18:31:55.622520 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:55 crc kubenswrapper[4909]: I1002 18:31:55.689832 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.049235 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kdqxq" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="registry-server" containerID="cri-o://296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a" gracePeriod=2 Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.600545 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.631519 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities\") pod \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.631738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content\") pod \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.631845 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5wx7\" (UniqueName: \"kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7\") pod \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\" (UID: \"128dc6e0-b902-4276-af8e-b5e3aa5f055e\") " Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.633159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities" (OuterVolumeSpecName: "utilities") pod "128dc6e0-b902-4276-af8e-b5e3aa5f055e" (UID: "128dc6e0-b902-4276-af8e-b5e3aa5f055e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.638359 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7" (OuterVolumeSpecName: "kube-api-access-b5wx7") pod "128dc6e0-b902-4276-af8e-b5e3aa5f055e" (UID: "128dc6e0-b902-4276-af8e-b5e3aa5f055e"). InnerVolumeSpecName "kube-api-access-b5wx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.660996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128dc6e0-b902-4276-af8e-b5e3aa5f055e" (UID: "128dc6e0-b902-4276-af8e-b5e3aa5f055e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.733509 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.733932 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5wx7\" (UniqueName: \"kubernetes.io/projected/128dc6e0-b902-4276-af8e-b5e3aa5f055e-kube-api-access-b5wx7\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:56 crc kubenswrapper[4909]: I1002 18:31:56.733948 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128dc6e0-b902-4276-af8e-b5e3aa5f055e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.060197 4909 generic.go:334] "Generic (PLEG): container finished" podID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerID="296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a" exitCode=0 Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.060252 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerDied","Data":"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a"} Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.060276 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdqxq" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.060295 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdqxq" event={"ID":"128dc6e0-b902-4276-af8e-b5e3aa5f055e","Type":"ContainerDied","Data":"66718d0317c5379973f99640d75ff0ebb2010f8ffdf49d958389147e7223cc14"} Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.060337 4909 scope.go:117] "RemoveContainer" containerID="296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.085180 4909 scope.go:117] "RemoveContainer" containerID="d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.101244 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.105495 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdqxq"] Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.124525 4909 scope.go:117] "RemoveContainer" containerID="c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.150220 4909 scope.go:117] "RemoveContainer" containerID="296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a" Oct 02 18:31:57 crc kubenswrapper[4909]: E1002 18:31:57.150778 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a\": container with ID starting with 296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a not found: ID does not exist" containerID="296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.150933 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a"} err="failed to get container status \"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a\": rpc error: code = NotFound desc = could not find container \"296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a\": container with ID starting with 296e7e7a1c3e14315cbe28589c5335b882d0cc0119c78bbd9b3d657bd9d82c3a not found: ID does not exist" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.151118 4909 scope.go:117] "RemoveContainer" containerID="d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf" Oct 02 18:31:57 crc kubenswrapper[4909]: E1002 18:31:57.151932 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf\": container with ID starting with d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf not found: ID does not exist" containerID="d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.151970 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf"} err="failed to get container status \"d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf\": rpc error: code = NotFound desc = could not find container \"d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf\": container with ID starting with d81e91b508e57d2a823cf0fb9d4a5d2f5741cd45628eaaf9f247481231626abf not found: ID does not exist" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.151996 4909 scope.go:117] "RemoveContainer" containerID="c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63" Oct 02 18:31:57 crc kubenswrapper[4909]: E1002 
18:31:57.152353 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63\": container with ID starting with c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63 not found: ID does not exist" containerID="c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.152487 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63"} err="failed to get container status \"c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63\": rpc error: code = NotFound desc = could not find container \"c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63\": container with ID starting with c539c6d388db16bf788bc6c3b69344c9cbf156ab8e40ca45cba3ba7c9fa95a63 not found: ID does not exist" Oct 02 18:31:57 crc kubenswrapper[4909]: I1002 18:31:57.619808 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" path="/var/lib/kubelet/pods/128dc6e0-b902-4276-af8e-b5e3aa5f055e/volumes" Oct 02 18:32:01 crc kubenswrapper[4909]: I1002 18:32:01.560055 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.368472 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369231 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369252 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" 
containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369271 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369282 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369297 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369309 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369330 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369340 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369354 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369364 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369382 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369392 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" 
containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369408 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369419 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="extract-utilities" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369441 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369451 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="extract-content" Oct 02 18:32:18 crc kubenswrapper[4909]: E1002 18:32:18.369463 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369475 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369662 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fb6493-046e-458a-9b18-1c3223449c01" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369686 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c010275-8a6f-4c89-a1f9-a6c65cb63253" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.369702 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="128dc6e0-b902-4276-af8e-b5e3aa5f055e" containerName="registry-server" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.371129 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.399180 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.512574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.513127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.513216 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcwp\" (UniqueName: \"kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.614348 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.614436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nzcwp\" (UniqueName: \"kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.614489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.615458 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.615488 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.639948 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcwp\" (UniqueName: \"kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp\") pod \"redhat-operators-jfrwf\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:18 crc kubenswrapper[4909]: I1002 18:32:18.752426 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:19 crc kubenswrapper[4909]: I1002 18:32:19.056261 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:19 crc kubenswrapper[4909]: I1002 18:32:19.238999 4909 generic.go:334] "Generic (PLEG): container finished" podID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerID="e69536c0847888333f63aebdc8cc29d6b5bff751cc8af4da672cceb2d9eeab76" exitCode=0 Oct 02 18:32:19 crc kubenswrapper[4909]: I1002 18:32:19.239079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerDied","Data":"e69536c0847888333f63aebdc8cc29d6b5bff751cc8af4da672cceb2d9eeab76"} Oct 02 18:32:19 crc kubenswrapper[4909]: I1002 18:32:19.240925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerStarted","Data":"19433ff2cbc7b7d6b1fb4618c2c2be8e917b74a949383f323e0625ab5a388cea"} Oct 02 18:32:20 crc kubenswrapper[4909]: I1002 18:32:20.248667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerStarted","Data":"41d434095015c96d5a554c509d38e3229a44142ca6d5d6c5b597e33f469010c6"} Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.259661 4909 generic.go:334] "Generic (PLEG): container finished" podID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerID="41d434095015c96d5a554c509d38e3229a44142ca6d5d6c5b597e33f469010c6" exitCode=0 Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.259734 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" 
event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerDied","Data":"41d434095015c96d5a554c509d38e3229a44142ca6d5d6c5b597e33f469010c6"} Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.380241 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-4vxh5"] Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.381554 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.388536 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.389069 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.391231 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.391588 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-zzgzc" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.395267 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.397620 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-4vxh5"] Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.411078 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.517827 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-4vxh5"] Oct 02 18:32:21 crc kubenswrapper[4909]: E1002 18:32:21.519427 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-pqd86 metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-4vxh5" podUID="f43944f2-5152-4db8-84b5-2d1917885756" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565541 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565559 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqd86\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565578 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565698 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565823 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.565973 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.667517 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.667756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.667827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.667925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqd86\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86\") pod 
\"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.667992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.668096 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.668174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.668240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.668302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 
18:32:21.668384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.668459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.669482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.669650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.669665 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: E1002 18:32:21.669670 4909 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not 
found Oct 02 18:32:21 crc kubenswrapper[4909]: E1002 18:32:21.669797 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver podName:f43944f2-5152-4db8-84b5-2d1917885756 nodeName:}" failed. No retries permitted until 2025-10-02 18:32:22.169772688 +0000 UTC m=+863.357268607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver") pod "collector-4vxh5" (UID: "f43944f2-5152-4db8-84b5-2d1917885756") : secret "collector-syslog-receiver" not found Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.670312 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.670410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.674781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.685937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token\") pod 
\"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.686167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.689242 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqd86\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:21 crc kubenswrapper[4909]: I1002 18:32:21.688642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.176502 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.182419 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") pod \"collector-4vxh5\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " pod="openshift-logging/collector-4vxh5" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 
18:32:22.266411 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-4vxh5" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.274439 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-4vxh5" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378309 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378451 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378488 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: 
\"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378523 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqd86\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378555 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378633 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378673 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" 
(UniqueName: \"kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir\") pod \"f43944f2-5152-4db8-84b5-2d1917885756\" (UID: \"f43944f2-5152-4db8-84b5-2d1917885756\") " Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.378942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir" (OuterVolumeSpecName: "datadir") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.379247 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.379916 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.379997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.380421 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config" (OuterVolumeSpecName: "config") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.381514 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token" (OuterVolumeSpecName: "sa-token") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.382000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp" (OuterVolumeSpecName: "tmp") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.382881 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86" (OuterVolumeSpecName: "kube-api-access-pqd86") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "kube-api-access-pqd86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.383556 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.383595 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics" (OuterVolumeSpecName: "metrics") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.390782 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token" (OuterVolumeSpecName: "collector-token") pod "f43944f2-5152-4db8-84b5-2d1917885756" (UID: "f43944f2-5152-4db8-84b5-2d1917885756"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480576 4909 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480629 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480654 4909 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480678 4909 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43944f2-5152-4db8-84b5-2d1917885756-tmp\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480697 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqd86\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-kube-api-access-pqd86\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480716 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480734 4909 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-entrypoint\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480753 4909 reconciler_common.go:293] "Volume 
detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/f43944f2-5152-4db8-84b5-2d1917885756-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480772 4909 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/f43944f2-5152-4db8-84b5-2d1917885756-collector-token\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480793 4909 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/f43944f2-5152-4db8-84b5-2d1917885756-datadir\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:22 crc kubenswrapper[4909]: I1002 18:32:22.480811 4909 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/f43944f2-5152-4db8-84b5-2d1917885756-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.277942 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-4vxh5" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.277990 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerStarted","Data":"3620ae93b7330a3048743041f5b8b5220c6e43032e40f9dfe9d81a2ba81f774e"} Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.339631 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jfrwf" podStartSLOduration=2.030400579 podStartE2EDuration="5.339598974s" podCreationTimestamp="2025-10-02 18:32:18 +0000 UTC" firstStartedPulling="2025-10-02 18:32:19.241110518 +0000 UTC m=+860.428606377" lastFinishedPulling="2025-10-02 18:32:22.550308903 +0000 UTC m=+863.737804772" observedRunningTime="2025-10-02 18:32:23.302007155 +0000 UTC m=+864.489503034" watchObservedRunningTime="2025-10-02 18:32:23.339598974 +0000 UTC m=+864.527094873" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.358475 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-4vxh5"] Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.368406 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-4vxh5"] Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.374539 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-t77xg"] Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.375626 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.377517 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.380076 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-zzgzc" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.380258 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.380652 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.382491 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.387059 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.390675 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-t77xg"] Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493812 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-sa-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493877 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-syslog-receiver\") pod \"collector-t77xg\" (UID: 
\"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493911 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-entrypoint\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5k2\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-kube-api-access-dq5k2\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/3903e4b2-91fd-4a38-880e-543863535cf5-datadir\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.493976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-metrics\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.494289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " 
pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.494377 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config-openshift-service-cacrt\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.494446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-trusted-ca\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.494493 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3903e4b2-91fd-4a38-880e-543863535cf5-tmp\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.494510 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.596435 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3903e4b2-91fd-4a38-880e-543863535cf5-tmp\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.596503 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.596712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-sa-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.596873 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-syslog-receiver\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.596992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-entrypoint\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5k2\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-kube-api-access-dq5k2\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597234 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/3903e4b2-91fd-4a38-880e-543863535cf5-datadir\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-metrics\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/3903e4b2-91fd-4a38-880e-543863535cf5-datadir\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config-openshift-service-cacrt\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.597775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-trusted-ca\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " 
pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.598139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.598766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-entrypoint\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.599420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-config-openshift-service-cacrt\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.601149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3903e4b2-91fd-4a38-880e-543863535cf5-trusted-ca\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.603611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-syslog-receiver\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.604437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-metrics\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.606334 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3903e4b2-91fd-4a38-880e-543863535cf5-tmp\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.607553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3903e4b2-91fd-4a38-880e-543863535cf5-collector-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.630804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-sa-token\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.631440 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43944f2-5152-4db8-84b5-2d1917885756" path="/var/lib/kubelet/pods/f43944f2-5152-4db8-84b5-2d1917885756/volumes" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.638610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5k2\" (UniqueName: \"kubernetes.io/projected/3903e4b2-91fd-4a38-880e-543863535cf5-kube-api-access-dq5k2\") pod \"collector-t77xg\" (UID: \"3903e4b2-91fd-4a38-880e-543863535cf5\") " pod="openshift-logging/collector-t77xg" Oct 02 18:32:23 crc kubenswrapper[4909]: I1002 18:32:23.700465 4909 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-logging/collector-t77xg" Oct 02 18:32:24 crc kubenswrapper[4909]: I1002 18:32:24.196100 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-t77xg"] Oct 02 18:32:24 crc kubenswrapper[4909]: I1002 18:32:24.285559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-t77xg" event={"ID":"3903e4b2-91fd-4a38-880e-543863535cf5","Type":"ContainerStarted","Data":"6e5d7542e4e454e57015c073b580a38c68dc7aee819abf65713ea4f2eb66c6e8"} Oct 02 18:32:28 crc kubenswrapper[4909]: I1002 18:32:28.753307 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:28 crc kubenswrapper[4909]: I1002 18:32:28.754778 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:28 crc kubenswrapper[4909]: I1002 18:32:28.818949 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:29 crc kubenswrapper[4909]: I1002 18:32:29.381055 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:29 crc kubenswrapper[4909]: I1002 18:32:29.441252 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:31 crc kubenswrapper[4909]: I1002 18:32:31.333362 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jfrwf" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="registry-server" containerID="cri-o://3620ae93b7330a3048743041f5b8b5220c6e43032e40f9dfe9d81a2ba81f774e" gracePeriod=2 Oct 02 18:32:32 crc kubenswrapper[4909]: I1002 18:32:32.344688 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerID="3620ae93b7330a3048743041f5b8b5220c6e43032e40f9dfe9d81a2ba81f774e" exitCode=0 Oct 02 18:32:32 crc kubenswrapper[4909]: I1002 18:32:32.344845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerDied","Data":"3620ae93b7330a3048743041f5b8b5220c6e43032e40f9dfe9d81a2ba81f774e"} Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.217729 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.358224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfrwf" event={"ID":"41ca4e2d-52fc-49a3-b39c-972d712054bb","Type":"ContainerDied","Data":"19433ff2cbc7b7d6b1fb4618c2c2be8e917b74a949383f323e0625ab5a388cea"} Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.358314 4909 scope.go:117] "RemoveContainer" containerID="3620ae93b7330a3048743041f5b8b5220c6e43032e40f9dfe9d81a2ba81f774e" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.358358 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfrwf" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.365446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities\") pod \"41ca4e2d-52fc-49a3-b39c-972d712054bb\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.365595 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content\") pod \"41ca4e2d-52fc-49a3-b39c-972d712054bb\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.365743 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzcwp\" (UniqueName: \"kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp\") pod \"41ca4e2d-52fc-49a3-b39c-972d712054bb\" (UID: \"41ca4e2d-52fc-49a3-b39c-972d712054bb\") " Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.367035 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities" (OuterVolumeSpecName: "utilities") pod "41ca4e2d-52fc-49a3-b39c-972d712054bb" (UID: "41ca4e2d-52fc-49a3-b39c-972d712054bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.370641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp" (OuterVolumeSpecName: "kube-api-access-nzcwp") pod "41ca4e2d-52fc-49a3-b39c-972d712054bb" (UID: "41ca4e2d-52fc-49a3-b39c-972d712054bb"). InnerVolumeSpecName "kube-api-access-nzcwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.389266 4909 scope.go:117] "RemoveContainer" containerID="41d434095015c96d5a554c509d38e3229a44142ca6d5d6c5b597e33f469010c6" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.414613 4909 scope.go:117] "RemoveContainer" containerID="e69536c0847888333f63aebdc8cc29d6b5bff751cc8af4da672cceb2d9eeab76" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.455750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41ca4e2d-52fc-49a3-b39c-972d712054bb" (UID: "41ca4e2d-52fc-49a3-b39c-972d712054bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.468131 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.468184 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca4e2d-52fc-49a3-b39c-972d712054bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.468216 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzcwp\" (UniqueName: \"kubernetes.io/projected/41ca4e2d-52fc-49a3-b39c-972d712054bb-kube-api-access-nzcwp\") on node \"crc\" DevicePath \"\"" Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.690909 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:33 crc kubenswrapper[4909]: I1002 18:32:33.725548 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-jfrwf"] Oct 02 18:32:34 crc kubenswrapper[4909]: I1002 18:32:34.370486 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-t77xg" event={"ID":"3903e4b2-91fd-4a38-880e-543863535cf5","Type":"ContainerStarted","Data":"5906d8414061863aae277b927bcf60db1c9f41f84a250fe1e99e0eb95f5ef3f0"} Oct 02 18:32:34 crc kubenswrapper[4909]: I1002 18:32:34.398474 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-t77xg" podStartSLOduration=2.350618003 podStartE2EDuration="11.398452526s" podCreationTimestamp="2025-10-02 18:32:23 +0000 UTC" firstStartedPulling="2025-10-02 18:32:24.206792409 +0000 UTC m=+865.394288308" lastFinishedPulling="2025-10-02 18:32:33.254626932 +0000 UTC m=+874.442122831" observedRunningTime="2025-10-02 18:32:34.391923645 +0000 UTC m=+875.579419584" watchObservedRunningTime="2025-10-02 18:32:34.398452526 +0000 UTC m=+875.585948395" Oct 02 18:32:35 crc kubenswrapper[4909]: I1002 18:32:35.630356 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" path="/var/lib/kubelet/pods/41ca4e2d-52fc-49a3-b39c-972d712054bb/volumes" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.268670 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq"] Oct 02 18:32:57 crc kubenswrapper[4909]: E1002 18:32:57.269375 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="extract-utilities" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.269388 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="extract-utilities" Oct 02 18:32:57 crc kubenswrapper[4909]: E1002 18:32:57.269412 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="registry-server" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.269419 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="registry-server" Oct 02 18:32:57 crc kubenswrapper[4909]: E1002 18:32:57.269431 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="extract-content" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.269437 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="extract-content" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.269550 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ca4e2d-52fc-49a3-b39c-972d712054bb" containerName="registry-server" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.270611 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.272988 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.284500 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq"] Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.381430 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc 
kubenswrapper[4909]: I1002 18:32:57.381605 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.381926 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbnh\" (UniqueName: \"kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.483264 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.483385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.483443 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lpbnh\" (UniqueName: \"kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.484114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.484201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.519247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbnh\" (UniqueName: \"kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:57 crc kubenswrapper[4909]: I1002 18:32:57.590561 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:32:58 crc kubenswrapper[4909]: I1002 18:32:58.089089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq"] Oct 02 18:32:58 crc kubenswrapper[4909]: I1002 18:32:58.583522 4909 generic.go:334] "Generic (PLEG): container finished" podID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerID="56b55917e123a6b5cbe587f22a83df3688a8bc774a1be37b45e505e76b115e80" exitCode=0 Oct 02 18:32:58 crc kubenswrapper[4909]: I1002 18:32:58.583993 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerDied","Data":"56b55917e123a6b5cbe587f22a83df3688a8bc774a1be37b45e505e76b115e80"} Oct 02 18:32:58 crc kubenswrapper[4909]: I1002 18:32:58.584049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerStarted","Data":"26afa0724e30708a9ff43ac688ebbe20a99c509d010926e4262d2068e01697e1"} Oct 02 18:33:00 crc kubenswrapper[4909]: I1002 18:33:00.604658 4909 generic.go:334] "Generic (PLEG): container finished" podID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerID="69ccb7980c4f6b1dfad3b9d5d3caf8e4d6e479af16ce5aea9e25271b1af93832" exitCode=0 Oct 02 18:33:00 crc kubenswrapper[4909]: I1002 18:33:00.604754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerDied","Data":"69ccb7980c4f6b1dfad3b9d5d3caf8e4d6e479af16ce5aea9e25271b1af93832"} Oct 02 18:33:02 crc kubenswrapper[4909]: I1002 18:33:02.628635 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerStarted","Data":"44cd5a7638a73e7ce68d4fdf631b25e39a3a67c327e51e6eb150f969d729917e"} Oct 02 18:33:03 crc kubenswrapper[4909]: I1002 18:33:03.639687 4909 generic.go:334] "Generic (PLEG): container finished" podID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerID="44cd5a7638a73e7ce68d4fdf631b25e39a3a67c327e51e6eb150f969d729917e" exitCode=0 Oct 02 18:33:03 crc kubenswrapper[4909]: I1002 18:33:03.639752 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerDied","Data":"44cd5a7638a73e7ce68d4fdf631b25e39a3a67c327e51e6eb150f969d729917e"} Oct 02 18:33:04 crc kubenswrapper[4909]: I1002 18:33:04.972536 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.109767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util\") pod \"c49d92a4-6655-4c61-b381-b8d52b36399b\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.109899 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle\") pod \"c49d92a4-6655-4c61-b381-b8d52b36399b\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.110059 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpbnh\" (UniqueName: \"kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh\") pod \"c49d92a4-6655-4c61-b381-b8d52b36399b\" (UID: \"c49d92a4-6655-4c61-b381-b8d52b36399b\") " Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.110694 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle" (OuterVolumeSpecName: "bundle") pod "c49d92a4-6655-4c61-b381-b8d52b36399b" (UID: "c49d92a4-6655-4c61-b381-b8d52b36399b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.121539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh" (OuterVolumeSpecName: "kube-api-access-lpbnh") pod "c49d92a4-6655-4c61-b381-b8d52b36399b" (UID: "c49d92a4-6655-4c61-b381-b8d52b36399b"). InnerVolumeSpecName "kube-api-access-lpbnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.125635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util" (OuterVolumeSpecName: "util") pod "c49d92a4-6655-4c61-b381-b8d52b36399b" (UID: "c49d92a4-6655-4c61-b381-b8d52b36399b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.211937 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-util\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.211983 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49d92a4-6655-4c61-b381-b8d52b36399b-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.211998 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpbnh\" (UniqueName: \"kubernetes.io/projected/c49d92a4-6655-4c61-b381-b8d52b36399b-kube-api-access-lpbnh\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.657314 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" event={"ID":"c49d92a4-6655-4c61-b381-b8d52b36399b","Type":"ContainerDied","Data":"26afa0724e30708a9ff43ac688ebbe20a99c509d010926e4262d2068e01697e1"} Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.657365 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26afa0724e30708a9ff43ac688ebbe20a99c509d010926e4262d2068e01697e1" Oct 02 18:33:05 crc kubenswrapper[4909]: I1002 18:33:05.657495 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.025498 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp"] Oct 02 18:33:09 crc kubenswrapper[4909]: E1002 18:33:09.025978 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="util" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.025990 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="util" Oct 02 18:33:09 crc kubenswrapper[4909]: E1002 18:33:09.026003 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="extract" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.026008 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="extract" Oct 02 18:33:09 crc kubenswrapper[4909]: E1002 18:33:09.026019 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="pull" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.026041 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="pull" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.026164 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49d92a4-6655-4c61-b381-b8d52b36399b" containerName="extract" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.026668 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.028685 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.029460 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6lktn" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.029460 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.041952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp"] Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.173352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftzm\" (UniqueName: \"kubernetes.io/projected/2c0fec14-d9a3-4b4d-96dd-a22938d3c736-kube-api-access-xftzm\") pod \"nmstate-operator-858ddd8f98-vlhrp\" (UID: \"2c0fec14-d9a3-4b4d-96dd-a22938d3c736\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.275016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftzm\" (UniqueName: \"kubernetes.io/projected/2c0fec14-d9a3-4b4d-96dd-a22938d3c736-kube-api-access-xftzm\") pod \"nmstate-operator-858ddd8f98-vlhrp\" (UID: \"2c0fec14-d9a3-4b4d-96dd-a22938d3c736\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.306246 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftzm\" (UniqueName: \"kubernetes.io/projected/2c0fec14-d9a3-4b4d-96dd-a22938d3c736-kube-api-access-xftzm\") pod \"nmstate-operator-858ddd8f98-vlhrp\" (UID: 
\"2c0fec14-d9a3-4b4d-96dd-a22938d3c736\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.341376 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.585922 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp"] Oct 02 18:33:09 crc kubenswrapper[4909]: I1002 18:33:09.683334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" event={"ID":"2c0fec14-d9a3-4b4d-96dd-a22938d3c736","Type":"ContainerStarted","Data":"2a4e85856275e14f5cb0b744b656a65177a4b0c5da853f80ae142dbf02b753e9"} Oct 02 18:33:13 crc kubenswrapper[4909]: I1002 18:33:13.721386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" event={"ID":"2c0fec14-d9a3-4b4d-96dd-a22938d3c736","Type":"ContainerStarted","Data":"70d6cb7008f8324952a7f35e1515f5b522531b0fe1c88a185b064e4af07071fd"} Oct 02 18:33:13 crc kubenswrapper[4909]: I1002 18:33:13.752107 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vlhrp" podStartSLOduration=1.105914652 podStartE2EDuration="4.752058507s" podCreationTimestamp="2025-10-02 18:33:09 +0000 UTC" firstStartedPulling="2025-10-02 18:33:09.605175479 +0000 UTC m=+910.792671328" lastFinishedPulling="2025-10-02 18:33:13.251319274 +0000 UTC m=+914.438815183" observedRunningTime="2025-10-02 18:33:13.747675282 +0000 UTC m=+914.935171231" watchObservedRunningTime="2025-10-02 18:33:13.752058507 +0000 UTC m=+914.939554376" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.445708 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 
18:33:18.447262 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.449111 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2n5kp" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.463407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.472747 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.474532 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.479142 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.479886 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gdng9"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.480720 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.521521 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.536089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6wff\" (UniqueName: \"kubernetes.io/projected/f283eb50-d6fb-464c-a075-2e79f4d56305-kube-api-access-q6wff\") pod \"nmstate-metrics-fdff9cb8d-gts4l\" (UID: \"f283eb50-d6fb-464c-a075-2e79f4d56305\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.592235 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.593329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.596659 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.596921 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.597116 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ph2pg" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.608627 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.639893 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.639996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkqdr\" (UniqueName: \"kubernetes.io/projected/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-kube-api-access-tkqdr\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.640043 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-nmstate-lock\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.640080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-ovs-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.640112 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbh6\" (UniqueName: \"kubernetes.io/projected/416821c4-4252-4c41-9e8d-5bf7689aae61-kube-api-access-8wbh6\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.640152 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-q6wff\" (UniqueName: \"kubernetes.io/projected/f283eb50-d6fb-464c-a075-2e79f4d56305-kube-api-access-q6wff\") pod \"nmstate-metrics-fdff9cb8d-gts4l\" (UID: \"f283eb50-d6fb-464c-a075-2e79f4d56305\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.640189 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-dbus-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.672906 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6wff\" (UniqueName: \"kubernetes.io/projected/f283eb50-d6fb-464c-a075-2e79f4d56305-kube-api-access-q6wff\") pod \"nmstate-metrics-fdff9cb8d-gts4l\" (UID: \"f283eb50-d6fb-464c-a075-2e79f4d56305\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.741977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742146 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n9r\" (UniqueName: \"kubernetes.io/projected/a0702a66-247b-4591-95cf-2ee69ffb5473-kube-api-access-g9n9r\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 
18:33:18.742216 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: E1002 18:33:18.742221 4909 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742248 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkqdr\" (UniqueName: \"kubernetes.io/projected/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-kube-api-access-tkqdr\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-nmstate-lock\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: E1002 18:33:18.742308 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair podName:e036fd92-1027-4ac0-bde4-1c69c7fa7d4c nodeName:}" failed. No retries permitted until 2025-10-02 18:33:19.242283494 +0000 UTC m=+920.429779433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair") pod "nmstate-webhook-6cdbc54649-mfztc" (UID: "e036fd92-1027-4ac0-bde4-1c69c7fa7d4c") : secret "openshift-nmstate-webhook" not found Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-nmstate-lock\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742381 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a0702a66-247b-4591-95cf-2ee69ffb5473-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-ovs-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbh6\" (UniqueName: \"kubernetes.io/projected/416821c4-4252-4c41-9e8d-5bf7689aae61-kube-api-access-8wbh6\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.742670 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-dbus-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.743068 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-ovs-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.744329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/416821c4-4252-4c41-9e8d-5bf7689aae61-dbus-socket\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.772422 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.786792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbh6\" (UniqueName: \"kubernetes.io/projected/416821c4-4252-4c41-9e8d-5bf7689aae61-kube-api-access-8wbh6\") pod \"nmstate-handler-gdng9\" (UID: \"416821c4-4252-4c41-9e8d-5bf7689aae61\") " pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.797061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkqdr\" (UniqueName: \"kubernetes.io/projected/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-kube-api-access-tkqdr\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.808417 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.844225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n9r\" (UniqueName: \"kubernetes.io/projected/a0702a66-247b-4591-95cf-2ee69ffb5473-kube-api-access-g9n9r\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.844284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.844331 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a0702a66-247b-4591-95cf-2ee69ffb5473-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: E1002 18:33:18.845131 4909 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 02 18:33:18 crc kubenswrapper[4909]: E1002 18:33:18.845185 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert podName:a0702a66-247b-4591-95cf-2ee69ffb5473 nodeName:}" failed. No retries permitted until 2025-10-02 18:33:19.345169577 +0000 UTC m=+920.532665436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-s4vhz" (UID: "a0702a66-247b-4591-95cf-2ee69ffb5473") : secret "plugin-serving-cert" not found Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.845841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a0702a66-247b-4591-95cf-2ee69ffb5473-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.853832 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.854642 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.864423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n9r\" (UniqueName: \"kubernetes.io/projected/a0702a66-247b-4591-95cf-2ee69ffb5473-kube-api-access-g9n9r\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.881218 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"] Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.945647 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjc2g\" (UniqueName: \"kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.945987 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.946055 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.946105 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.946156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.946221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:18 crc kubenswrapper[4909]: I1002 18:33:18.946258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047690 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047724 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047759 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047816 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.047897 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wjc2g\" (UniqueName: \"kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.048578 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.049859 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.049875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.049975 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.053093 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.055290 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.067772 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjc2g\" (UniqueName: \"kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g\") pod \"console-b6c9598fb-p2t8g\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") " pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.197374 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.251173 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.255069 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e036fd92-1027-4ac0-bde4-1c69c7fa7d4c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-mfztc\" (UID: \"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.297146 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l"] Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.352934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.356248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0702a66-247b-4591-95cf-2ee69ffb5473-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-s4vhz\" (UID: \"a0702a66-247b-4591-95cf-2ee69ffb5473\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.395670 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.509517 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.590997 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc"] Oct 02 18:33:19 crc kubenswrapper[4909]: W1002 18:33:19.602213 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode036fd92_1027_4ac0_bde4_1c69c7fa7d4c.slice/crio-bae67a9a7ad4a336bec6d03061e0032544d2da360fad1c0cb6a5d0761c18a183 WatchSource:0}: Error finding container bae67a9a7ad4a336bec6d03061e0032544d2da360fad1c0cb6a5d0761c18a183: Status 404 returned error can't find the container with id bae67a9a7ad4a336bec6d03061e0032544d2da360fad1c0cb6a5d0761c18a183 Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.666720 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"] Oct 02 18:33:19 crc kubenswrapper[4909]: W1002 18:33:19.673398 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2806f9d6_de1a_4c32_bec7_c06e06884bd5.slice/crio-8c4dc6d2bef4fd35fb03c5caefe247527bc36d6d3b92f5e6f1f7de3a9b4d67ff WatchSource:0}: Error finding container 8c4dc6d2bef4fd35fb03c5caefe247527bc36d6d3b92f5e6f1f7de3a9b4d67ff: Status 404 returned error can't find the container with id 8c4dc6d2bef4fd35fb03c5caefe247527bc36d6d3b92f5e6f1f7de3a9b4d67ff Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.782046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" 
event={"ID":"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c","Type":"ContainerStarted","Data":"bae67a9a7ad4a336bec6d03061e0032544d2da360fad1c0cb6a5d0761c18a183"} Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.785240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gdng9" event={"ID":"416821c4-4252-4c41-9e8d-5bf7689aae61","Type":"ContainerStarted","Data":"880781b7536ce8352d4b23007acb8eb52274bdc21ff71f78db047ce027a048a6"} Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.786879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c9598fb-p2t8g" event={"ID":"2806f9d6-de1a-4c32-bec7-c06e06884bd5","Type":"ContainerStarted","Data":"8c4dc6d2bef4fd35fb03c5caefe247527bc36d6d3b92f5e6f1f7de3a9b4d67ff"} Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.788826 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" event={"ID":"f283eb50-d6fb-464c-a075-2e79f4d56305","Type":"ContainerStarted","Data":"c57051f22c72e4cc06283ccf645b04b569be7579910a6cd02699d77223e318e2"} Oct 02 18:33:19 crc kubenswrapper[4909]: I1002 18:33:19.931360 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz"] Oct 02 18:33:20 crc kubenswrapper[4909]: I1002 18:33:20.813136 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c9598fb-p2t8g" event={"ID":"2806f9d6-de1a-4c32-bec7-c06e06884bd5","Type":"ContainerStarted","Data":"61ada46d0fb7375d25a32ca0f28d75d250d69abf72e3b72b1a834a7ea297c9dd"} Oct 02 18:33:20 crc kubenswrapper[4909]: I1002 18:33:20.816006 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" event={"ID":"a0702a66-247b-4591-95cf-2ee69ffb5473","Type":"ContainerStarted","Data":"5c11c70ab39f04db754b688e91a3e6ecfb1c32e056a30c6916e7a37513e9e891"} Oct 02 18:33:20 crc kubenswrapper[4909]: I1002 
18:33:20.845310 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b6c9598fb-p2t8g" podStartSLOduration=2.84529122 podStartE2EDuration="2.84529122s" podCreationTimestamp="2025-10-02 18:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:33:20.842270837 +0000 UTC m=+922.029766716" watchObservedRunningTime="2025-10-02 18:33:20.84529122 +0000 UTC m=+922.032787079" Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.408834 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.841631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" event={"ID":"a0702a66-247b-4591-95cf-2ee69ffb5473","Type":"ContainerStarted","Data":"80dc97ebf9cbd934ad3ec14771ebc28de189f577df52ada82f0d0d121a2ba842"} Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.843363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" event={"ID":"f283eb50-d6fb-464c-a075-2e79f4d56305","Type":"ContainerStarted","Data":"f7a72ca2601f6ceca937626c0792c26c15f77d5a8ea58d6ce89a18859c1f7456"} Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.845460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" event={"ID":"e036fd92-1027-4ac0-bde4-1c69c7fa7d4c","Type":"ContainerStarted","Data":"dbe492758da72653ae7488c10b919364f73b71017d9e35e6a1b1ca67e195eddc"} Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.845638 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.847242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-gdng9" event={"ID":"416821c4-4252-4c41-9e8d-5bf7689aae61","Type":"ContainerStarted","Data":"920b9626d9f13aa925c42e1a6f125204614b32ac7efb4f9dedfb9acc19728e80"} Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.847350 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.858806 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-s4vhz" podStartSLOduration=2.66730344 podStartE2EDuration="5.858792345s" podCreationTimestamp="2025-10-02 18:33:18 +0000 UTC" firstStartedPulling="2025-10-02 18:33:19.953188838 +0000 UTC m=+921.140684717" lastFinishedPulling="2025-10-02 18:33:23.144677763 +0000 UTC m=+924.332173622" observedRunningTime="2025-10-02 18:33:23.858322391 +0000 UTC m=+925.045818250" watchObservedRunningTime="2025-10-02 18:33:23.858792345 +0000 UTC m=+925.046288204" Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.882413 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gdng9" podStartSLOduration=1.568121263 podStartE2EDuration="5.882393854s" podCreationTimestamp="2025-10-02 18:33:18 +0000 UTC" firstStartedPulling="2025-10-02 18:33:18.84268181 +0000 UTC m=+920.030177669" lastFinishedPulling="2025-10-02 18:33:23.156954401 +0000 UTC m=+924.344450260" observedRunningTime="2025-10-02 18:33:23.880692171 +0000 UTC m=+925.068188030" watchObservedRunningTime="2025-10-02 18:33:23.882393854 +0000 UTC m=+925.069889713" Oct 02 18:33:23 crc kubenswrapper[4909]: I1002 18:33:23.906099 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" podStartSLOduration=2.369083935 podStartE2EDuration="5.906075684s" podCreationTimestamp="2025-10-02 18:33:18 +0000 UTC" firstStartedPulling="2025-10-02 
18:33:19.60984725 +0000 UTC m=+920.797343109" lastFinishedPulling="2025-10-02 18:33:23.146838999 +0000 UTC m=+924.334334858" observedRunningTime="2025-10-02 18:33:23.90467875 +0000 UTC m=+925.092174679" watchObservedRunningTime="2025-10-02 18:33:23.906075684 +0000 UTC m=+925.093571543" Oct 02 18:33:26 crc kubenswrapper[4909]: I1002 18:33:26.871442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" event={"ID":"f283eb50-d6fb-464c-a075-2e79f4d56305","Type":"ContainerStarted","Data":"0c11fefb76d946a396ba7a9ce60af571beb7a4d84c51c7dc847e17d9b0631473"} Oct 02 18:33:27 crc kubenswrapper[4909]: I1002 18:33:27.901230 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gts4l" podStartSLOduration=2.663364188 podStartE2EDuration="9.901210492s" podCreationTimestamp="2025-10-02 18:33:18 +0000 UTC" firstStartedPulling="2025-10-02 18:33:19.357317071 +0000 UTC m=+920.544812930" lastFinishedPulling="2025-10-02 18:33:26.595163355 +0000 UTC m=+927.782659234" observedRunningTime="2025-10-02 18:33:27.897588001 +0000 UTC m=+929.085083890" watchObservedRunningTime="2025-10-02 18:33:27.901210492 +0000 UTC m=+929.088706361" Oct 02 18:33:28 crc kubenswrapper[4909]: I1002 18:33:28.850142 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gdng9" Oct 02 18:33:29 crc kubenswrapper[4909]: I1002 18:33:29.198858 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:29 crc kubenswrapper[4909]: I1002 18:33:29.198942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:29 crc kubenswrapper[4909]: I1002 18:33:29.207459 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:29 
crc kubenswrapper[4909]: I1002 18:33:29.899721 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b6c9598fb-p2t8g" Oct 02 18:33:29 crc kubenswrapper[4909]: I1002 18:33:29.955898 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p4h8w"] Oct 02 18:33:39 crc kubenswrapper[4909]: I1002 18:33:39.407269 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-mfztc" Oct 02 18:33:53 crc kubenswrapper[4909]: I1002 18:33:53.054567 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:33:53 crc kubenswrapper[4909]: I1002 18:33:53.055217 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.010088 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-p4h8w" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" containerID="cri-o://4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe" gracePeriod=15 Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.470843 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p4h8w_2d362a1c-75bb-4778-a58b-cf43e02bac6b/console/0.log" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.471788 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631680 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhqlj\" (UniqueName: \"kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631761 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.631865 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.632767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.632904 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle\") pod \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\" (UID: \"2d362a1c-75bb-4778-a58b-cf43e02bac6b\") " Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.632901 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config" (OuterVolumeSpecName: "console-config") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.633524 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca" (OuterVolumeSpecName: "service-ca") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.634703 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.636071 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.636097 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.636115 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.636133 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d362a1c-75bb-4778-a58b-cf43e02bac6b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.641929 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.642627 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj" (OuterVolumeSpecName: "kube-api-access-vhqlj") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "kube-api-access-vhqlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.650022 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2d362a1c-75bb-4778-a58b-cf43e02bac6b" (UID: "2d362a1c-75bb-4778-a58b-cf43e02bac6b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.738191 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhqlj\" (UniqueName: \"kubernetes.io/projected/2d362a1c-75bb-4778-a58b-cf43e02bac6b-kube-api-access-vhqlj\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.738232 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:55 crc kubenswrapper[4909]: I1002 18:33:55.738249 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d362a1c-75bb-4778-a58b-cf43e02bac6b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.142808 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-p4h8w_2d362a1c-75bb-4778-a58b-cf43e02bac6b/console/0.log" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.142876 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerID="4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe" exitCode=2 Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.142915 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p4h8w" event={"ID":"2d362a1c-75bb-4778-a58b-cf43e02bac6b","Type":"ContainerDied","Data":"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe"} Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.142949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p4h8w" event={"ID":"2d362a1c-75bb-4778-a58b-cf43e02bac6b","Type":"ContainerDied","Data":"47f8add027b46c9823b26e3a86b9faa02aad2a7fac54041a35fd289a1f7807e4"} Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.142970 4909 scope.go:117] "RemoveContainer" containerID="4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.143072 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p4h8w" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.185788 4909 scope.go:117] "RemoveContainer" containerID="4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe" Oct 02 18:33:56 crc kubenswrapper[4909]: E1002 18:33:56.187628 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe\": container with ID starting with 4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe not found: ID does not exist" containerID="4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.187685 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe"} err="failed to get container status \"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe\": rpc error: code = NotFound desc = could not find container \"4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe\": container with ID starting with 4edc6aace02471b594072dfa870976c872794e650deeff5fa3fe2f1597fa84fe not found: ID does not exist" Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.208477 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p4h8w"] Oct 02 18:33:56 crc kubenswrapper[4909]: I1002 18:33:56.216243 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-p4h8w"] Oct 02 18:33:57 crc kubenswrapper[4909]: I1002 18:33:57.627532 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" path="/var/lib/kubelet/pods/2d362a1c-75bb-4778-a58b-cf43e02bac6b/volumes" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.156726 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8"] Oct 02 18:34:00 crc kubenswrapper[4909]: E1002 18:34:00.158158 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.158260 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.158471 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d362a1c-75bb-4778-a58b-cf43e02bac6b" containerName="console" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.159551 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.164745 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8"] Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.183659 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.319723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.320149 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.320313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnrdc\" (UniqueName: \"kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.422309 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.422577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.422692 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnrdc\" (UniqueName: \"kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.422429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.422838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.443362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnrdc\" (UniqueName: \"kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.479455 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:00 crc kubenswrapper[4909]: I1002 18:34:00.925833 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8"] Oct 02 18:34:00 crc kubenswrapper[4909]: W1002 18:34:00.935757 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68c1569_5f7c_405d_8179_795ab29eb2b4.slice/crio-01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e WatchSource:0}: Error finding container 01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e: Status 404 returned error can't find the container with id 01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e Oct 02 18:34:01 crc kubenswrapper[4909]: I1002 18:34:01.193556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerStarted","Data":"b0ad38b9732ae1034cb2ef2a2540c4d22af35dc843255456746a6ded14e22ae5"} Oct 02 18:34:01 crc kubenswrapper[4909]: I1002 18:34:01.193611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerStarted","Data":"01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e"} Oct 02 18:34:02 crc kubenswrapper[4909]: I1002 18:34:02.205768 4909 generic.go:334] "Generic (PLEG): container finished" podID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerID="b0ad38b9732ae1034cb2ef2a2540c4d22af35dc843255456746a6ded14e22ae5" exitCode=0 Oct 02 18:34:02 crc kubenswrapper[4909]: I1002 18:34:02.205832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerDied","Data":"b0ad38b9732ae1034cb2ef2a2540c4d22af35dc843255456746a6ded14e22ae5"} Oct 02 18:34:05 crc kubenswrapper[4909]: I1002 18:34:05.236918 4909 generic.go:334] "Generic (PLEG): container finished" podID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerID="4bd440422b90cc118d20f8f2abda91788e8fa7a976ea2644dfbebe9586df5d51" exitCode=0 Oct 02 18:34:05 crc kubenswrapper[4909]: I1002 18:34:05.237066 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerDied","Data":"4bd440422b90cc118d20f8f2abda91788e8fa7a976ea2644dfbebe9586df5d51"} Oct 02 18:34:06 crc kubenswrapper[4909]: I1002 18:34:06.248051 4909 generic.go:334] "Generic (PLEG): container finished" podID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerID="e028a2f41be848be37ef93c6a7c01167b21c9a3f143d395b4dbd3b0ed662ac1c" exitCode=0 Oct 02 18:34:06 crc kubenswrapper[4909]: I1002 18:34:06.248166 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerDied","Data":"e028a2f41be848be37ef93c6a7c01167b21c9a3f143d395b4dbd3b0ed662ac1c"} Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.559804 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.649453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle\") pod \"c68c1569-5f7c-405d-8179-795ab29eb2b4\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.649544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnrdc\" (UniqueName: \"kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc\") pod \"c68c1569-5f7c-405d-8179-795ab29eb2b4\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.649626 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util\") pod \"c68c1569-5f7c-405d-8179-795ab29eb2b4\" (UID: \"c68c1569-5f7c-405d-8179-795ab29eb2b4\") " Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.651288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle" (OuterVolumeSpecName: "bundle") pod "c68c1569-5f7c-405d-8179-795ab29eb2b4" (UID: "c68c1569-5f7c-405d-8179-795ab29eb2b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.656365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc" (OuterVolumeSpecName: "kube-api-access-tnrdc") pod "c68c1569-5f7c-405d-8179-795ab29eb2b4" (UID: "c68c1569-5f7c-405d-8179-795ab29eb2b4"). InnerVolumeSpecName "kube-api-access-tnrdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.666975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util" (OuterVolumeSpecName: "util") pod "c68c1569-5f7c-405d-8179-795ab29eb2b4" (UID: "c68c1569-5f7c-405d-8179-795ab29eb2b4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.751789 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-util\") on node \"crc\" DevicePath \"\"" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.751832 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c68c1569-5f7c-405d-8179-795ab29eb2b4-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:34:07 crc kubenswrapper[4909]: I1002 18:34:07.751846 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnrdc\" (UniqueName: \"kubernetes.io/projected/c68c1569-5f7c-405d-8179-795ab29eb2b4-kube-api-access-tnrdc\") on node \"crc\" DevicePath \"\"" Oct 02 18:34:08 crc kubenswrapper[4909]: I1002 18:34:08.267618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" event={"ID":"c68c1569-5f7c-405d-8179-795ab29eb2b4","Type":"ContainerDied","Data":"01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e"} Oct 02 18:34:08 crc kubenswrapper[4909]: I1002 18:34:08.267661 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c499ca2de6bd5cfa271937e68db7f2855cf9448362e745081a2510a790eb5e" Oct 02 18:34:08 crc kubenswrapper[4909]: I1002 18:34:08.267742 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.125741 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5"] Oct 02 18:34:18 crc kubenswrapper[4909]: E1002 18:34:18.126894 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="extract" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.126912 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="extract" Oct 02 18:34:18 crc kubenswrapper[4909]: E1002 18:34:18.126926 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="util" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.126934 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="util" Oct 02 18:34:18 crc kubenswrapper[4909]: E1002 18:34:18.126944 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="pull" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.126954 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="pull" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.127158 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68c1569-5f7c-405d-8179-795ab29eb2b4" containerName="extract" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.127781 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.131039 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.131698 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.131913 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.131978 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.132281 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lp7dx" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.132475 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-webhook-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.132630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-apiservice-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.132757 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4kqt\" (UniqueName: \"kubernetes.io/projected/04606b24-1344-4bc4-a7b5-c5bd3282afec-kube-api-access-x4kqt\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.157396 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5"] Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.235328 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-apiservice-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.235694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4kqt\" (UniqueName: \"kubernetes.io/projected/04606b24-1344-4bc4-a7b5-c5bd3282afec-kube-api-access-x4kqt\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.235857 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-webhook-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.243496 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-webhook-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.246711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04606b24-1344-4bc4-a7b5-c5bd3282afec-apiservice-cert\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.267991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4kqt\" (UniqueName: \"kubernetes.io/projected/04606b24-1344-4bc4-a7b5-c5bd3282afec-kube-api-access-x4kqt\") pod \"metallb-operator-controller-manager-568bfffc64-vr4t5\" (UID: \"04606b24-1344-4bc4-a7b5-c5bd3282afec\") " pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.445506 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.527738 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj"] Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.528771 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.532588 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.532807 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qfwdx" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.536551 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.547924 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj"] Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.641719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmvgs\" (UniqueName: \"kubernetes.io/projected/9ffcfe84-1e40-4d9f-b411-4f8707346e92-kube-api-access-xmvgs\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.642051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-apiservice-cert\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.642090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-webhook-cert\") 
pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.743356 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmvgs\" (UniqueName: \"kubernetes.io/projected/9ffcfe84-1e40-4d9f-b411-4f8707346e92-kube-api-access-xmvgs\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.743416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-apiservice-cert\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.743448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-webhook-cert\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.750827 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-webhook-cert\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.752621 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffcfe84-1e40-4d9f-b411-4f8707346e92-apiservice-cert\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.766543 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmvgs\" (UniqueName: \"kubernetes.io/projected/9ffcfe84-1e40-4d9f-b411-4f8707346e92-kube-api-access-xmvgs\") pod \"metallb-operator-webhook-server-f8cdfbb4-bh9bj\" (UID: \"9ffcfe84-1e40-4d9f-b411-4f8707346e92\") " pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.855153 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:18 crc kubenswrapper[4909]: I1002 18:34:18.986171 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5"] Oct 02 18:34:18 crc kubenswrapper[4909]: W1002 18:34:18.997900 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04606b24_1344_4bc4_a7b5_c5bd3282afec.slice/crio-dcddeb322141c9b1a894b3d87218fb1d558eb84c6b9ff746e1fcef4e6c8e9fff WatchSource:0}: Error finding container dcddeb322141c9b1a894b3d87218fb1d558eb84c6b9ff746e1fcef4e6c8e9fff: Status 404 returned error can't find the container with id dcddeb322141c9b1a894b3d87218fb1d558eb84c6b9ff746e1fcef4e6c8e9fff Oct 02 18:34:19 crc kubenswrapper[4909]: I1002 18:34:19.343536 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj"] Oct 02 18:34:19 crc kubenswrapper[4909]: I1002 18:34:19.363175 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" event={"ID":"9ffcfe84-1e40-4d9f-b411-4f8707346e92","Type":"ContainerStarted","Data":"ffc0e5910a450d79748cec4229380f541e9dea12282d1049f66dd449aba2daf1"} Oct 02 18:34:19 crc kubenswrapper[4909]: I1002 18:34:19.365098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" event={"ID":"04606b24-1344-4bc4-a7b5-c5bd3282afec","Type":"ContainerStarted","Data":"dcddeb322141c9b1a894b3d87218fb1d558eb84c6b9ff746e1fcef4e6c8e9fff"} Oct 02 18:34:23 crc kubenswrapper[4909]: I1002 18:34:23.054360 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:34:23 crc kubenswrapper[4909]: I1002 18:34:23.054916 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:34:23 crc kubenswrapper[4909]: I1002 18:34:23.423609 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" event={"ID":"04606b24-1344-4bc4-a7b5-c5bd3282afec","Type":"ContainerStarted","Data":"e5d922c4e45081c0b74eb1be7c67fa067ac1a7e7aef39d75c366228a1ccf0ca5"} Oct 02 18:34:23 crc kubenswrapper[4909]: I1002 18:34:23.425207 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:23 crc kubenswrapper[4909]: I1002 18:34:23.460859 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" podStartSLOduration=2.019716703 podStartE2EDuration="5.46084024s" podCreationTimestamp="2025-10-02 18:34:18 +0000 UTC" firstStartedPulling="2025-10-02 18:34:19.000785572 +0000 UTC m=+980.188281481" lastFinishedPulling="2025-10-02 18:34:22.441909159 +0000 UTC m=+983.629405018" observedRunningTime="2025-10-02 18:34:23.446104433 +0000 UTC m=+984.633600302" watchObservedRunningTime="2025-10-02 18:34:23.46084024 +0000 UTC m=+984.648336119" Oct 02 18:34:24 crc kubenswrapper[4909]: I1002 18:34:24.432157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" event={"ID":"9ffcfe84-1e40-4d9f-b411-4f8707346e92","Type":"ContainerStarted","Data":"1708cc2651db622b751804263f7b603e6bc038c4cc0df510a2a4670d0052927a"} Oct 02 18:34:24 crc kubenswrapper[4909]: I1002 18:34:24.432594 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:24 crc kubenswrapper[4909]: I1002 18:34:24.457976 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" podStartSLOduration=1.571016584 podStartE2EDuration="6.457949074s" podCreationTimestamp="2025-10-02 18:34:18 +0000 UTC" firstStartedPulling="2025-10-02 18:34:19.348556468 +0000 UTC m=+980.536052327" lastFinishedPulling="2025-10-02 18:34:24.235488928 +0000 UTC m=+985.422984817" observedRunningTime="2025-10-02 18:34:24.453076374 +0000 UTC m=+985.640572283" watchObservedRunningTime="2025-10-02 18:34:24.457949074 +0000 UTC m=+985.645444953" Oct 02 18:34:38 crc kubenswrapper[4909]: I1002 18:34:38.865526 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-f8cdfbb4-bh9bj" Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.054803 4909 patch_prober.go:28] 
interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.055551 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.055618 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.056654 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.056760 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5" gracePeriod=600 Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.652762 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5" exitCode=0 Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 
18:34:53.652848 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5"} Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.653173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5"} Oct 02 18:34:53 crc kubenswrapper[4909]: I1002 18:34:53.653203 4909 scope.go:117] "RemoveContainer" containerID="bbfbb601ff2a52b11f89346581838b2c67f48bfb68a316624366c9e5079f4722" Oct 02 18:34:58 crc kubenswrapper[4909]: I1002 18:34:58.449392 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-568bfffc64-vr4t5" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.262705 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lwgmx"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.265667 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.268242 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7pfdt" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.268293 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.268438 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.329535 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.330398 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.332161 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.349462 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.422153 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-x67sq"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.423276 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.425737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.425985 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g8pq6" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.426549 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-vwbdq"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.426644 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.426912 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.427625 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.428899 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/843ddbf5-2368-4807-903e-41a960b8e78e-frr-startup\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tpqw\" (UniqueName: \"kubernetes.io/projected/94246755-9836-440f-a7d3-5189dd6d1b6f-kube-api-access-2tpqw\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-reloader\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/94246755-9836-440f-a7d3-5189dd6d1b6f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435716 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-sockets\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435736 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd57l\" (UniqueName: \"kubernetes.io/projected/843ddbf5-2368-4807-903e-41a960b8e78e-kube-api-access-hd57l\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435764 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-metrics\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.435782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-conf\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.442689 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-vwbdq"] Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537449 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tpqw\" (UniqueName: \"kubernetes.io/projected/94246755-9836-440f-a7d3-5189dd6d1b6f-kube-api-access-2tpqw\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-reloader\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7da13784-fb30-48c8-8a21-99e151b56645-metallb-excludel2\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537544 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-cert\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94246755-9836-440f-a7d3-5189dd6d1b6f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537586 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-sockets\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.537677 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd57l\" (UniqueName: \"kubernetes.io/projected/843ddbf5-2368-4807-903e-41a960b8e78e-kube-api-access-hd57l\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-reloader\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-sockets\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538593 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-metrics-certs\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538702 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-metrics\") pod 
\"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538778 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-metrics-certs\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-conf\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538886 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9z7b\" (UniqueName: \"kubernetes.io/projected/7da13784-fb30-48c8-8a21-99e151b56645-kube-api-access-l9z7b\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538947 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwkf\" (UniqueName: \"kubernetes.io/projected/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-kube-api-access-6rwkf\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.538963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-metrics\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" 
Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.539069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/843ddbf5-2368-4807-903e-41a960b8e78e-frr-startup\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.539096 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/843ddbf5-2368-4807-903e-41a960b8e78e-frr-conf\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.539102 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.539161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.540895 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.541461 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.541566 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 18:34:59 crc kubenswrapper[4909]: E1002 
18:34:59.549216 4909 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 02 18:34:59 crc kubenswrapper[4909]: E1002 18:34:59.549278 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs podName:843ddbf5-2368-4807-903e-41a960b8e78e nodeName:}" failed. No retries permitted until 2025-10-02 18:35:00.049252781 +0000 UTC m=+1021.236748650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs") pod "frr-k8s-lwgmx" (UID: "843ddbf5-2368-4807-903e-41a960b8e78e") : secret "frr-k8s-certs-secret" not found Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.550183 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/843ddbf5-2368-4807-903e-41a960b8e78e-frr-startup\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.557651 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94246755-9836-440f-a7d3-5189dd6d1b6f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.567687 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tpqw\" (UniqueName: \"kubernetes.io/projected/94246755-9836-440f-a7d3-5189dd6d1b6f-kube-api-access-2tpqw\") pod \"frr-k8s-webhook-server-64bf5d555-9wd5p\" (UID: \"94246755-9836-440f-a7d3-5189dd6d1b6f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.574489 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd57l\" (UniqueName: \"kubernetes.io/projected/843ddbf5-2368-4807-903e-41a960b8e78e-kube-api-access-hd57l\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7da13784-fb30-48c8-8a21-99e151b56645-metallb-excludel2\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640115 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-cert\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-metrics-certs\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-metrics-certs\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640517 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9z7b\" (UniqueName: 
\"kubernetes.io/projected/7da13784-fb30-48c8-8a21-99e151b56645-kube-api-access-l9z7b\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwkf\" (UniqueName: \"kubernetes.io/projected/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-kube-api-access-6rwkf\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.640717 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.642098 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.643677 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.643763 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.643886 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.643905 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 18:34:59 crc kubenswrapper[4909]: E1002 18:34:59.651291 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 18:34:59 
crc kubenswrapper[4909]: E1002 18:34:59.651377 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist podName:7da13784-fb30-48c8-8a21-99e151b56645 nodeName:}" failed. No retries permitted until 2025-10-02 18:35:00.151352702 +0000 UTC m=+1021.338848561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist") pod "speaker-x67sq" (UID: "7da13784-fb30-48c8-8a21-99e151b56645") : secret "metallb-memberlist" not found Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.652339 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7da13784-fb30-48c8-8a21-99e151b56645-metallb-excludel2\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.656348 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7pfdt" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.658084 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-metrics-certs\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.658591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-metrics-certs\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.661467 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-cert\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.662896 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.666069 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwkf\" (UniqueName: \"kubernetes.io/projected/6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9-kube-api-access-6rwkf\") pod \"controller-68d546b9d8-vwbdq\" (UID: \"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9\") " pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.687271 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9z7b\" (UniqueName: \"kubernetes.io/projected/7da13784-fb30-48c8-8a21-99e151b56645-kube-api-access-l9z7b\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:34:59 crc kubenswrapper[4909]: I1002 18:34:59.757191 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.052878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.058366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843ddbf5-2368-4807-903e-41a960b8e78e-metrics-certs\") pod \"frr-k8s-lwgmx\" (UID: \"843ddbf5-2368-4807-903e-41a960b8e78e\") " pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.085464 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p"] Oct 02 18:35:00 crc kubenswrapper[4909]: W1002 18:35:00.101861 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94246755_9836_440f_a7d3_5189dd6d1b6f.slice/crio-bf5c4a2ee32674c3fe1c563a2f5c197393d4a4804c1488435959bdcbbdacec83 WatchSource:0}: Error finding container bf5c4a2ee32674c3fe1c563a2f5c197393d4a4804c1488435959bdcbbdacec83: Status 404 returned error can't find the container with id bf5c4a2ee32674c3fe1c563a2f5c197393d4a4804c1488435959bdcbbdacec83 Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.154495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:35:00 crc kubenswrapper[4909]: E1002 18:35:00.154647 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Oct 02 18:35:00 crc kubenswrapper[4909]: E1002 18:35:00.154736 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist podName:7da13784-fb30-48c8-8a21-99e151b56645 nodeName:}" failed. No retries permitted until 2025-10-02 18:35:01.154715918 +0000 UTC m=+1022.342211787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist") pod "speaker-x67sq" (UID: "7da13784-fb30-48c8-8a21-99e151b56645") : secret "metallb-memberlist" not found Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.182464 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.212671 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-vwbdq"] Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.729110 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" event={"ID":"94246755-9836-440f-a7d3-5189dd6d1b6f","Type":"ContainerStarted","Data":"bf5c4a2ee32674c3fe1c563a2f5c197393d4a4804c1488435959bdcbbdacec83"} Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.732259 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-vwbdq" event={"ID":"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9","Type":"ContainerStarted","Data":"bb3548ecd2a1deb4ffe626efacce303c54dbeb800ebd47f22f5933373f7839f8"} Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.732304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-vwbdq" event={"ID":"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9","Type":"ContainerStarted","Data":"86e13bee4421d266820058614344af2499f9f8869f92bd555b637e127fb3a0ae"} Oct 02 18:35:00 crc 
kubenswrapper[4909]: I1002 18:35:00.732317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-vwbdq" event={"ID":"6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9","Type":"ContainerStarted","Data":"e381ba26ba0031ee6a6d50aafba8444cf1a7924efa930ca586f375acc412576a"} Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.732457 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.733273 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"fb608ee393d65e0a88f667c82765d55f6c8f29cc1bb858dea4cedaf0d94da52d"} Oct 02 18:35:00 crc kubenswrapper[4909]: I1002 18:35:00.748707 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-vwbdq" podStartSLOduration=1.748691747 podStartE2EDuration="1.748691747s" podCreationTimestamp="2025-10-02 18:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:35:00.746712676 +0000 UTC m=+1021.934208545" watchObservedRunningTime="2025-10-02 18:35:00.748691747 +0000 UTC m=+1021.936187606" Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.171323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist\") pod \"speaker-x67sq\" (UID: \"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.178296 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7da13784-fb30-48c8-8a21-99e151b56645-memberlist\") pod \"speaker-x67sq\" (UID: 
\"7da13784-fb30-48c8-8a21-99e151b56645\") " pod="metallb-system/speaker-x67sq" Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.247150 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g8pq6" Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.255250 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-x67sq" Oct 02 18:35:01 crc kubenswrapper[4909]: W1002 18:35:01.288115 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da13784_fb30_48c8_8a21_99e151b56645.slice/crio-c631eb3c02e056b33e2e3e4a7ad4e7c5d03765e60318a5d5ffc969388d5828ba WatchSource:0}: Error finding container c631eb3c02e056b33e2e3e4a7ad4e7c5d03765e60318a5d5ffc969388d5828ba: Status 404 returned error can't find the container with id c631eb3c02e056b33e2e3e4a7ad4e7c5d03765e60318a5d5ffc969388d5828ba Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.745250 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x67sq" event={"ID":"7da13784-fb30-48c8-8a21-99e151b56645","Type":"ContainerStarted","Data":"399a9e58cf6128f4dc6fb80c00b7dc5987e7ad8883dfec36b3236aa6d4e0072b"} Oct 02 18:35:01 crc kubenswrapper[4909]: I1002 18:35:01.745739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x67sq" event={"ID":"7da13784-fb30-48c8-8a21-99e151b56645","Type":"ContainerStarted","Data":"c631eb3c02e056b33e2e3e4a7ad4e7c5d03765e60318a5d5ffc969388d5828ba"} Oct 02 18:35:02 crc kubenswrapper[4909]: I1002 18:35:02.753568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x67sq" event={"ID":"7da13784-fb30-48c8-8a21-99e151b56645","Type":"ContainerStarted","Data":"2d5800b58c0a14267b2bd4f641f56a828881e53b2756f9b461a26be4c0c1624b"} Oct 02 18:35:02 crc kubenswrapper[4909]: I1002 18:35:02.753805 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/speaker-x67sq" Oct 02 18:35:02 crc kubenswrapper[4909]: I1002 18:35:02.780516 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-x67sq" podStartSLOduration=3.780495913 podStartE2EDuration="3.780495913s" podCreationTimestamp="2025-10-02 18:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:35:02.776215461 +0000 UTC m=+1023.963711320" watchObservedRunningTime="2025-10-02 18:35:02.780495913 +0000 UTC m=+1023.967991772" Oct 02 18:35:07 crc kubenswrapper[4909]: I1002 18:35:07.798443 4909 generic.go:334] "Generic (PLEG): container finished" podID="843ddbf5-2368-4807-903e-41a960b8e78e" containerID="65ef98c84e8f03bf21afcdc4a0ce3e92dd4a9cebf2ee180857838e9067e53e0f" exitCode=0 Oct 02 18:35:07 crc kubenswrapper[4909]: I1002 18:35:07.798573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerDied","Data":"65ef98c84e8f03bf21afcdc4a0ce3e92dd4a9cebf2ee180857838e9067e53e0f"} Oct 02 18:35:07 crc kubenswrapper[4909]: I1002 18:35:07.801767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" event={"ID":"94246755-9836-440f-a7d3-5189dd6d1b6f","Type":"ContainerStarted","Data":"5e9bf36f150abde64b347e7d6919256f79a3ccd45fdbc954da115e57806c1e2f"} Oct 02 18:35:07 crc kubenswrapper[4909]: I1002 18:35:07.801940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:35:07 crc kubenswrapper[4909]: I1002 18:35:07.852358 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" podStartSLOduration=1.667122005 podStartE2EDuration="8.852336094s" podCreationTimestamp="2025-10-02 
18:34:59 +0000 UTC" firstStartedPulling="2025-10-02 18:35:00.103943062 +0000 UTC m=+1021.291438931" lastFinishedPulling="2025-10-02 18:35:07.289157161 +0000 UTC m=+1028.476653020" observedRunningTime="2025-10-02 18:35:07.848541287 +0000 UTC m=+1029.036037146" watchObservedRunningTime="2025-10-02 18:35:07.852336094 +0000 UTC m=+1029.039831953" Oct 02 18:35:08 crc kubenswrapper[4909]: I1002 18:35:08.813156 4909 generic.go:334] "Generic (PLEG): container finished" podID="843ddbf5-2368-4807-903e-41a960b8e78e" containerID="d8e23b27be93eea76cc6b666a4dbe93c88dfce0a3e83b81992b301973bf2f23a" exitCode=0 Oct 02 18:35:08 crc kubenswrapper[4909]: I1002 18:35:08.813218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerDied","Data":"d8e23b27be93eea76cc6b666a4dbe93c88dfce0a3e83b81992b301973bf2f23a"} Oct 02 18:35:09 crc kubenswrapper[4909]: I1002 18:35:09.826604 4909 generic.go:334] "Generic (PLEG): container finished" podID="843ddbf5-2368-4807-903e-41a960b8e78e" containerID="48a11b8677b03bc5746c3f538dfd3baa46f54b58a04544d721c7f568d9b965cd" exitCode=0 Oct 02 18:35:09 crc kubenswrapper[4909]: I1002 18:35:09.826669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerDied","Data":"48a11b8677b03bc5746c3f538dfd3baa46f54b58a04544d721c7f568d9b965cd"} Oct 02 18:35:10 crc kubenswrapper[4909]: I1002 18:35:10.838278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"f8e030740ddf5e6a4b30332d7e6f9b588269fb3401b6e46d3daa4da3cbe0b5ed"} Oct 02 18:35:10 crc kubenswrapper[4909]: I1002 18:35:10.838747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" 
event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"54937f6f43eed804ff04683f2a678915ddc339cd20837b74c3a6dacee968b54c"} Oct 02 18:35:10 crc kubenswrapper[4909]: I1002 18:35:10.838763 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"f4e12a00ddd8ca12c3e0fdeddc7c763fffc73968a6596f8b14df5ca005e6381e"} Oct 02 18:35:10 crc kubenswrapper[4909]: I1002 18:35:10.838773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"f67b0ff208df05a949034a914df42378bcabf45930a329f18c87692d82d7d0bb"} Oct 02 18:35:10 crc kubenswrapper[4909]: I1002 18:35:10.838782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"496902d0341afd6b4440426667aa50fadae44680d1e11b2b7d7b1d3a8f771190"} Oct 02 18:35:11 crc kubenswrapper[4909]: I1002 18:35:11.261113 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-x67sq" Oct 02 18:35:11 crc kubenswrapper[4909]: I1002 18:35:11.856502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lwgmx" event={"ID":"843ddbf5-2368-4807-903e-41a960b8e78e","Type":"ContainerStarted","Data":"c82437050562a77840154785b08e61290dbdbc7944047d043e43856d1d1b839b"} Oct 02 18:35:11 crc kubenswrapper[4909]: I1002 18:35:11.857517 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:11 crc kubenswrapper[4909]: I1002 18:35:11.897289 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lwgmx" podStartSLOduration=5.902407816 podStartE2EDuration="12.897270095s" podCreationTimestamp="2025-10-02 18:34:59 +0000 
UTC" firstStartedPulling="2025-10-02 18:35:00.324705906 +0000 UTC m=+1021.512201765" lastFinishedPulling="2025-10-02 18:35:07.319568185 +0000 UTC m=+1028.507064044" observedRunningTime="2025-10-02 18:35:11.894253782 +0000 UTC m=+1033.081749661" watchObservedRunningTime="2025-10-02 18:35:11.897270095 +0000 UTC m=+1033.084765964" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.251338 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.252833 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.256115 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2kqv7" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.257910 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.265637 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.269631 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.306374 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44vw\" (UniqueName: \"kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw\") pod \"openstack-operator-index-4tmp9\" (UID: \"c61f8b7b-a66d-414a-99ae-118ec77d07c2\") " pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.407641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-x44vw\" (UniqueName: \"kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw\") pod \"openstack-operator-index-4tmp9\" (UID: \"c61f8b7b-a66d-414a-99ae-118ec77d07c2\") " pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.436802 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44vw\" (UniqueName: \"kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw\") pod \"openstack-operator-index-4tmp9\" (UID: \"c61f8b7b-a66d-414a-99ae-118ec77d07c2\") " pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:14 crc kubenswrapper[4909]: I1002 18:35:14.577044 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:15 crc kubenswrapper[4909]: I1002 18:35:15.044152 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:15 crc kubenswrapper[4909]: W1002 18:35:15.054383 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61f8b7b_a66d_414a_99ae_118ec77d07c2.slice/crio-433d2762f7d08dfd6e140a1fc55cbd2961f07ee8eb13ab1ee013d92dee89c913 WatchSource:0}: Error finding container 433d2762f7d08dfd6e140a1fc55cbd2961f07ee8eb13ab1ee013d92dee89c913: Status 404 returned error can't find the container with id 433d2762f7d08dfd6e140a1fc55cbd2961f07ee8eb13ab1ee013d92dee89c913 Oct 02 18:35:15 crc kubenswrapper[4909]: I1002 18:35:15.183357 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:15 crc kubenswrapper[4909]: I1002 18:35:15.230253 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:15 crc kubenswrapper[4909]: I1002 
18:35:15.895567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4tmp9" event={"ID":"c61f8b7b-a66d-414a-99ae-118ec77d07c2","Type":"ContainerStarted","Data":"433d2762f7d08dfd6e140a1fc55cbd2961f07ee8eb13ab1ee013d92dee89c913"} Oct 02 18:35:17 crc kubenswrapper[4909]: I1002 18:35:17.618690 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.210270 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6mfjw"] Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.211676 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.224306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6mfjw"] Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.386633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6c8\" (UniqueName: \"kubernetes.io/projected/b138a340-c0d7-436b-8308-14e2ea7a76a9-kube-api-access-qw6c8\") pod \"openstack-operator-index-6mfjw\" (UID: \"b138a340-c0d7-436b-8308-14e2ea7a76a9\") " pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.488855 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw6c8\" (UniqueName: \"kubernetes.io/projected/b138a340-c0d7-436b-8308-14e2ea7a76a9-kube-api-access-qw6c8\") pod \"openstack-operator-index-6mfjw\" (UID: \"b138a340-c0d7-436b-8308-14e2ea7a76a9\") " pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.511633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qw6c8\" (UniqueName: \"kubernetes.io/projected/b138a340-c0d7-436b-8308-14e2ea7a76a9-kube-api-access-qw6c8\") pod \"openstack-operator-index-6mfjw\" (UID: \"b138a340-c0d7-436b-8308-14e2ea7a76a9\") " pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.555238 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.920897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4tmp9" event={"ID":"c61f8b7b-a66d-414a-99ae-118ec77d07c2","Type":"ContainerStarted","Data":"c045e6510ad52c18222a021548816043dc8036e6f6e32d73dc38dfa2045d14e7"} Oct 02 18:35:18 crc kubenswrapper[4909]: I1002 18:35:18.921066 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-4tmp9" podUID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" containerName="registry-server" containerID="cri-o://c045e6510ad52c18222a021548816043dc8036e6f6e32d73dc38dfa2045d14e7" gracePeriod=2 Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.007656 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4tmp9" podStartSLOduration=1.71907488 podStartE2EDuration="5.007636361s" podCreationTimestamp="2025-10-02 18:35:14 +0000 UTC" firstStartedPulling="2025-10-02 18:35:15.059489964 +0000 UTC m=+1036.246985823" lastFinishedPulling="2025-10-02 18:35:18.348051435 +0000 UTC m=+1039.535547304" observedRunningTime="2025-10-02 18:35:18.946956167 +0000 UTC m=+1040.134452036" watchObservedRunningTime="2025-10-02 18:35:19.007636361 +0000 UTC m=+1040.195132220" Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.010980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6mfjw"] Oct 02 18:35:19 
crc kubenswrapper[4909]: W1002 18:35:19.018514 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb138a340_c0d7_436b_8308_14e2ea7a76a9.slice/crio-fbc7411cccc49f4b12cb0049dfaedbfcf3acb00e0235291dc67cf82d4dbefc13 WatchSource:0}: Error finding container fbc7411cccc49f4b12cb0049dfaedbfcf3acb00e0235291dc67cf82d4dbefc13: Status 404 returned error can't find the container with id fbc7411cccc49f4b12cb0049dfaedbfcf3acb00e0235291dc67cf82d4dbefc13 Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.670942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9wd5p" Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.765641 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-vwbdq" Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.934126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6mfjw" event={"ID":"b138a340-c0d7-436b-8308-14e2ea7a76a9","Type":"ContainerStarted","Data":"fbc7411cccc49f4b12cb0049dfaedbfcf3acb00e0235291dc67cf82d4dbefc13"} Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.935814 4909 generic.go:334] "Generic (PLEG): container finished" podID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" containerID="c045e6510ad52c18222a021548816043dc8036e6f6e32d73dc38dfa2045d14e7" exitCode=0 Oct 02 18:35:19 crc kubenswrapper[4909]: I1002 18:35:19.935843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4tmp9" event={"ID":"c61f8b7b-a66d-414a-99ae-118ec77d07c2","Type":"ContainerDied","Data":"c045e6510ad52c18222a021548816043dc8036e6f6e32d73dc38dfa2045d14e7"} Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.078816 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.185384 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lwgmx" Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.216963 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x44vw\" (UniqueName: \"kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw\") pod \"c61f8b7b-a66d-414a-99ae-118ec77d07c2\" (UID: \"c61f8b7b-a66d-414a-99ae-118ec77d07c2\") " Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.226182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw" (OuterVolumeSpecName: "kube-api-access-x44vw") pod "c61f8b7b-a66d-414a-99ae-118ec77d07c2" (UID: "c61f8b7b-a66d-414a-99ae-118ec77d07c2"). InnerVolumeSpecName "kube-api-access-x44vw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.319645 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x44vw\" (UniqueName: \"kubernetes.io/projected/c61f8b7b-a66d-414a-99ae-118ec77d07c2-kube-api-access-x44vw\") on node \"crc\" DevicePath \"\"" Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.960257 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6mfjw" event={"ID":"b138a340-c0d7-436b-8308-14e2ea7a76a9","Type":"ContainerStarted","Data":"1a20de2f1446e5fe051aef32967abb6ce2f23c46e1de3aff33be87c4391af7cc"} Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.964113 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4tmp9" event={"ID":"c61f8b7b-a66d-414a-99ae-118ec77d07c2","Type":"ContainerDied","Data":"433d2762f7d08dfd6e140a1fc55cbd2961f07ee8eb13ab1ee013d92dee89c913"} Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.964181 4909 scope.go:117] "RemoveContainer" containerID="c045e6510ad52c18222a021548816043dc8036e6f6e32d73dc38dfa2045d14e7" Oct 02 18:35:20 crc kubenswrapper[4909]: I1002 18:35:20.964208 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4tmp9" Oct 02 18:35:21 crc kubenswrapper[4909]: I1002 18:35:21.000727 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6mfjw" podStartSLOduration=2.184371982 podStartE2EDuration="3.000705184s" podCreationTimestamp="2025-10-02 18:35:18 +0000 UTC" firstStartedPulling="2025-10-02 18:35:19.021806291 +0000 UTC m=+1040.209302150" lastFinishedPulling="2025-10-02 18:35:19.838139453 +0000 UTC m=+1041.025635352" observedRunningTime="2025-10-02 18:35:20.992191059 +0000 UTC m=+1042.179686958" watchObservedRunningTime="2025-10-02 18:35:21.000705184 +0000 UTC m=+1042.188201053" Oct 02 18:35:21 crc kubenswrapper[4909]: I1002 18:35:21.030406 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:21 crc kubenswrapper[4909]: I1002 18:35:21.037109 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-4tmp9"] Oct 02 18:35:21 crc kubenswrapper[4909]: I1002 18:35:21.620186 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" path="/var/lib/kubelet/pods/c61f8b7b-a66d-414a-99ae-118ec77d07c2/volumes" Oct 02 18:35:28 crc kubenswrapper[4909]: I1002 18:35:28.555649 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:28 crc kubenswrapper[4909]: I1002 18:35:28.556560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:28 crc kubenswrapper[4909]: I1002 18:35:28.599918 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:29 crc kubenswrapper[4909]: I1002 18:35:29.053009 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6mfjw" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.873284 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp"] Oct 02 18:35:30 crc kubenswrapper[4909]: E1002 18:35:30.875747 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" containerName="registry-server" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.875904 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" containerName="registry-server" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.876342 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61f8b7b-a66d-414a-99ae-118ec77d07c2" containerName="registry-server" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.878374 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.881974 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fc42l" Oct 02 18:35:30 crc kubenswrapper[4909]: I1002 18:35:30.887337 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp"] Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.004059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.004437 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.004600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnk8h\" (UniqueName: \"kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 
18:35:31.106698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnk8h\" (UniqueName: \"kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.106856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.107080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.107754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.107911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.132751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnk8h\" (UniqueName: \"kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h\") pod \"393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.208553 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:31 crc kubenswrapper[4909]: I1002 18:35:31.753342 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp"] Oct 02 18:35:32 crc kubenswrapper[4909]: I1002 18:35:32.056174 4909 generic.go:334] "Generic (PLEG): container finished" podID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerID="39962173547cb62d2284ca74849924dd21b4937e33af2ca42b031a01e148fdd0" exitCode=0 Oct 02 18:35:32 crc kubenswrapper[4909]: I1002 18:35:32.056239 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" event={"ID":"7adcf953-dc05-488e-a463-7a0574cd88bc","Type":"ContainerDied","Data":"39962173547cb62d2284ca74849924dd21b4937e33af2ca42b031a01e148fdd0"} Oct 02 18:35:32 crc kubenswrapper[4909]: I1002 18:35:32.056310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" event={"ID":"7adcf953-dc05-488e-a463-7a0574cd88bc","Type":"ContainerStarted","Data":"70e4e36a83c20dcb396a1ba84140ebb032eef78bcfbd0f03821f4e760ce4f906"} Oct 02 18:35:33 crc kubenswrapper[4909]: I1002 18:35:33.075255 4909 generic.go:334] "Generic (PLEG): container finished" podID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerID="9975559450d1279b1db8e463998f36d15a0ddfb1c1093010cf999f5acb9dd70e" exitCode=0 Oct 02 18:35:33 crc kubenswrapper[4909]: I1002 18:35:33.075311 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" event={"ID":"7adcf953-dc05-488e-a463-7a0574cd88bc","Type":"ContainerDied","Data":"9975559450d1279b1db8e463998f36d15a0ddfb1c1093010cf999f5acb9dd70e"} Oct 02 18:35:34 crc kubenswrapper[4909]: I1002 18:35:34.092621 4909 generic.go:334] "Generic (PLEG): container finished" podID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerID="c4f84e2327f9a2bc6e68d2127cc9164a1f8d772f2ee0f16e7f35bece3815a1a6" exitCode=0 Oct 02 18:35:34 crc kubenswrapper[4909]: I1002 18:35:34.092730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" event={"ID":"7adcf953-dc05-488e-a463-7a0574cd88bc","Type":"ContainerDied","Data":"c4f84e2327f9a2bc6e68d2127cc9164a1f8d772f2ee0f16e7f35bece3815a1a6"} Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.472193 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.580825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnk8h\" (UniqueName: \"kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h\") pod \"7adcf953-dc05-488e-a463-7a0574cd88bc\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.580890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util\") pod \"7adcf953-dc05-488e-a463-7a0574cd88bc\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.581081 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle\") pod \"7adcf953-dc05-488e-a463-7a0574cd88bc\" (UID: \"7adcf953-dc05-488e-a463-7a0574cd88bc\") " Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.581762 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle" (OuterVolumeSpecName: "bundle") pod "7adcf953-dc05-488e-a463-7a0574cd88bc" (UID: "7adcf953-dc05-488e-a463-7a0574cd88bc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.591538 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h" (OuterVolumeSpecName: "kube-api-access-hnk8h") pod "7adcf953-dc05-488e-a463-7a0574cd88bc" (UID: "7adcf953-dc05-488e-a463-7a0574cd88bc"). InnerVolumeSpecName "kube-api-access-hnk8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.599251 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util" (OuterVolumeSpecName: "util") pod "7adcf953-dc05-488e-a463-7a0574cd88bc" (UID: "7adcf953-dc05-488e-a463-7a0574cd88bc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.683203 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.683256 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnk8h\" (UniqueName: \"kubernetes.io/projected/7adcf953-dc05-488e-a463-7a0574cd88bc-kube-api-access-hnk8h\") on node \"crc\" DevicePath \"\"" Oct 02 18:35:35 crc kubenswrapper[4909]: I1002 18:35:35.683317 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7adcf953-dc05-488e-a463-7a0574cd88bc-util\") on node \"crc\" DevicePath \"\"" Oct 02 18:35:36 crc kubenswrapper[4909]: I1002 18:35:36.121933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" event={"ID":"7adcf953-dc05-488e-a463-7a0574cd88bc","Type":"ContainerDied","Data":"70e4e36a83c20dcb396a1ba84140ebb032eef78bcfbd0f03821f4e760ce4f906"} Oct 02 18:35:36 crc kubenswrapper[4909]: I1002 18:35:36.122421 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e4e36a83c20dcb396a1ba84140ebb032eef78bcfbd0f03821f4e760ce4f906" Oct 02 18:35:36 crc kubenswrapper[4909]: I1002 18:35:36.122087 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.443935 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp"] Oct 02 18:35:43 crc kubenswrapper[4909]: E1002 18:35:43.446111 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="util" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.446133 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="util" Oct 02 18:35:43 crc kubenswrapper[4909]: E1002 18:35:43.446156 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="extract" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.446166 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="extract" Oct 02 18:35:43 crc kubenswrapper[4909]: E1002 18:35:43.446193 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="pull" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.446202 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="pull" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.446532 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7adcf953-dc05-488e-a463-7a0574cd88bc" containerName="extract" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.458350 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.466177 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-q7fdh" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.483417 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp"] Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.615627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff57t\" (UniqueName: \"kubernetes.io/projected/16c56fa3-a7be-4a4c-ba04-967d7a5f1fec-kube-api-access-ff57t\") pod \"openstack-operator-controller-operator-7c58d4ffff-87ksp\" (UID: \"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec\") " pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.717653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff57t\" (UniqueName: \"kubernetes.io/projected/16c56fa3-a7be-4a4c-ba04-967d7a5f1fec-kube-api-access-ff57t\") pod \"openstack-operator-controller-operator-7c58d4ffff-87ksp\" (UID: \"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec\") " pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.744312 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff57t\" (UniqueName: \"kubernetes.io/projected/16c56fa3-a7be-4a4c-ba04-967d7a5f1fec-kube-api-access-ff57t\") pod \"openstack-operator-controller-operator-7c58d4ffff-87ksp\" (UID: \"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec\") " pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:43 crc kubenswrapper[4909]: I1002 18:35:43.797908 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:44 crc kubenswrapper[4909]: I1002 18:35:44.325367 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp"] Oct 02 18:35:45 crc kubenswrapper[4909]: I1002 18:35:45.199536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" event={"ID":"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec","Type":"ContainerStarted","Data":"896981f88f9269cdbe9e34078f4d62408c87ffc7e78a9a6ad0ef3860385c1950"} Oct 02 18:35:49 crc kubenswrapper[4909]: I1002 18:35:49.234756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" event={"ID":"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec","Type":"ContainerStarted","Data":"d4a01c6432e6fe9635e4e448715c1765bcff634c96eed30249c5c0346100b919"} Oct 02 18:35:53 crc kubenswrapper[4909]: I1002 18:35:53.277749 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" event={"ID":"16c56fa3-a7be-4a4c-ba04-967d7a5f1fec","Type":"ContainerStarted","Data":"ae35565b44b52096016890d0c8f77d26438bbfa50b7606e73cd2e6bec71b6e01"} Oct 02 18:35:53 crc kubenswrapper[4909]: I1002 18:35:53.278237 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:53 crc kubenswrapper[4909]: I1002 18:35:53.281688 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" Oct 02 18:35:53 crc kubenswrapper[4909]: I1002 18:35:53.332523 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-7c58d4ffff-87ksp" podStartSLOduration=2.061161841 podStartE2EDuration="10.332506547s" podCreationTimestamp="2025-10-02 18:35:43 +0000 UTC" firstStartedPulling="2025-10-02 18:35:44.342587493 +0000 UTC m=+1065.530083362" lastFinishedPulling="2025-10-02 18:35:52.613932199 +0000 UTC m=+1073.801428068" observedRunningTime="2025-10-02 18:35:53.325975793 +0000 UTC m=+1074.513471712" watchObservedRunningTime="2025-10-02 18:35:53.332506547 +0000 UTC m=+1074.520002406" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.233558 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.237259 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.240059 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dnws8" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.240907 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.242479 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.243975 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wzwwb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.249909 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.290644 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.291975 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.295796 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.297302 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwst\" (UniqueName: \"kubernetes.io/projected/6f98751d-f257-4445-a739-7ff447f1c3d8-kube-api-access-plwst\") pod \"designate-operator-controller-manager-75dfd9b554-9jlbp\" (UID: \"6f98751d-f257-4445-a739-7ff447f1c3d8\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.297397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhzmm\" (UniqueName: \"kubernetes.io/projected/848033b4-f22a-4455-8699-45b7761bbee2-kube-api-access-xhzmm\") pod \"barbican-operator-controller-manager-6c675fb79f-fz856\" (UID: 
\"848033b4-f22a-4455-8699-45b7761bbee2\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.297536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6xg\" (UniqueName: \"kubernetes.io/projected/c108279d-762f-41ee-a725-55e9f58a8686-kube-api-access-ms6xg\") pod \"cinder-operator-controller-manager-79d68d6c85-2lwdw\" (UID: \"c108279d-762f-41ee-a725-55e9f58a8686\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.299742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bgg5l" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.352698 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-87dms"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.353910 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.357267 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4lgbz" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.374435 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.386526 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-87dms"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.393270 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-28j82"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.398334 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plwst\" (UniqueName: \"kubernetes.io/projected/6f98751d-f257-4445-a739-7ff447f1c3d8-kube-api-access-plwst\") pod \"designate-operator-controller-manager-75dfd9b554-9jlbp\" (UID: \"6f98751d-f257-4445-a739-7ff447f1c3d8\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.398381 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5km\" (UniqueName: \"kubernetes.io/projected/27b610f7-4219-4cb1-97b4-76a4627afc7a-kube-api-access-sj5km\") pod \"glance-operator-controller-manager-846dff85b5-87dms\" (UID: \"27b610f7-4219-4cb1-97b4-76a4627afc7a\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.398407 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xhzmm\" (UniqueName: \"kubernetes.io/projected/848033b4-f22a-4455-8699-45b7761bbee2-kube-api-access-xhzmm\") pod \"barbican-operator-controller-manager-6c675fb79f-fz856\" (UID: \"848033b4-f22a-4455-8699-45b7761bbee2\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.398462 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6xg\" (UniqueName: \"kubernetes.io/projected/c108279d-762f-41ee-a725-55e9f58a8686-kube-api-access-ms6xg\") pod \"cinder-operator-controller-manager-79d68d6c85-2lwdw\" (UID: \"c108279d-762f-41ee-a725-55e9f58a8686\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.404709 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-28j82"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.404862 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.406259 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.407995 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.412242 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-c9wsd" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.418089 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.419729 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.432299 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.432371 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.432625 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kb8bx" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.434428 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.440248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzmm\" (UniqueName: \"kubernetes.io/projected/848033b4-f22a-4455-8699-45b7761bbee2-kube-api-access-xhzmm\") pod \"barbican-operator-controller-manager-6c675fb79f-fz856\" (UID: \"848033b4-f22a-4455-8699-45b7761bbee2\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:10 crc 
kubenswrapper[4909]: I1002 18:36:10.442943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwst\" (UniqueName: \"kubernetes.io/projected/6f98751d-f257-4445-a739-7ff447f1c3d8-kube-api-access-plwst\") pod \"designate-operator-controller-manager-75dfd9b554-9jlbp\" (UID: \"6f98751d-f257-4445-a739-7ff447f1c3d8\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.442968 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.450500 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.459372 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.448492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6xg\" (UniqueName: \"kubernetes.io/projected/c108279d-762f-41ee-a725-55e9f58a8686-kube-api-access-ms6xg\") pod \"cinder-operator-controller-manager-79d68d6c85-2lwdw\" (UID: \"c108279d-762f-41ee-a725-55e9f58a8686\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.452832 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kbgml" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.461870 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.466176 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.462247 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.452086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.466826 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.473636 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.474117 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.475789 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-46b2r" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.477933 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qwb2r" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.479389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-w9bq2" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.479767 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-dqzjr" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.488119 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.491437 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.499714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5km\" (UniqueName: \"kubernetes.io/projected/27b610f7-4219-4cb1-97b4-76a4627afc7a-kube-api-access-sj5km\") pod \"glance-operator-controller-manager-846dff85b5-87dms\" (UID: \"27b610f7-4219-4cb1-97b4-76a4627afc7a\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.500816 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8"] Oct 02 18:36:10 crc 
kubenswrapper[4909]: I1002 18:36:10.511360 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.512190 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.517460 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-s6z6q" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.533819 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.535210 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.539432 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-gz7wm" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.547113 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.548410 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.552301 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f787g" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.560169 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.573187 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5km\" (UniqueName: \"kubernetes.io/projected/27b610f7-4219-4cb1-97b4-76a4627afc7a-kube-api-access-sj5km\") pod \"glance-operator-controller-manager-846dff85b5-87dms\" (UID: \"27b610f7-4219-4cb1-97b4-76a4627afc7a\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.580255 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.582445 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.605892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfstk\" (UniqueName: \"kubernetes.io/projected/ba5e23b3-da6b-41ec-9cac-11d4bc3a068b-kube-api-access-dfstk\") pod \"horizon-operator-controller-manager-6769b867d9-hkdrb\" (UID: \"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ntj\" (UniqueName: \"kubernetes.io/projected/5198c9b3-5c15-49de-9f3a-da04c80c4eb3-kube-api-access-m8ntj\") pod \"neutron-operator-controller-manager-6574bf987d-94lx8\" 
(UID: \"5198c9b3-5c15-49de-9f3a-da04c80c4eb3\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609112 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6lv\" (UniqueName: \"kubernetes.io/projected/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-kube-api-access-wj6lv\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609136 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhmg\" (UniqueName: \"kubernetes.io/projected/d89c2056-bfda-4177-86e6-bc00964d5f22-kube-api-access-sxhmg\") pod \"octavia-operator-controller-manager-59d6cfdf45-g2lmn\" (UID: \"d89c2056-bfda-4177-86e6-bc00964d5f22\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609160 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccgxh\" (UniqueName: \"kubernetes.io/projected/c2bd7e2c-d50f-4808-85f5-4bae9b91d272-kube-api-access-ccgxh\") pod \"mariadb-operator-controller-manager-5c468bf4d4-57hjm\" (UID: \"c2bd7e2c-d50f-4808-85f5-4bae9b91d272\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609177 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzj2g\" (UniqueName: \"kubernetes.io/projected/30b85c17-5d08-452e-9099-20e55cc86f9e-kube-api-access-kzj2g\") pod \"ironic-operator-controller-manager-84bc9db6cc-886ls\" (UID: \"30b85c17-5d08-452e-9099-20e55cc86f9e\") " 
pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609198 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpxs\" (UniqueName: \"kubernetes.io/projected/22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff-kube-api-access-fxpxs\") pod \"manila-operator-controller-manager-6fd6854b49-hwgrp\" (UID: \"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mjw\" (UniqueName: \"kubernetes.io/projected/f87d6d15-86ff-46e3-8361-447ce9aff98c-kube-api-access-w7mjw\") pod \"heat-operator-controller-manager-599898f689-28j82\" (UID: \"f87d6d15-86ff-46e3-8361-447ce9aff98c\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmzb\" (UniqueName: \"kubernetes.io/projected/6d4671d7-88da-4834-afb5-5deaf7e84cdb-kube-api-access-pjmzb\") pod \"keystone-operator-controller-manager-7f55849f88-9mjtf\" (UID: \"6d4671d7-88da-4834-afb5-5deaf7e84cdb\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609308 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2kx\" (UniqueName: \"kubernetes.io/projected/18e949b2-6404-43e9-954b-6a09780bf021-kube-api-access-xx2kx\") pod \"nova-operator-controller-manager-555c7456bd-zxm4p\" (UID: \"18e949b2-6404-43e9-954b-6a09780bf021\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 
18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.609332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.636150 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.637078 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.638525 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.649905 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9h2cc" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.681131 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.685734 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.687898 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.688166 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qh5gg" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.716082 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ntj\" (UniqueName: \"kubernetes.io/projected/5198c9b3-5c15-49de-9f3a-da04c80c4eb3-kube-api-access-m8ntj\") pod \"neutron-operator-controller-manager-6574bf987d-94lx8\" (UID: \"5198c9b3-5c15-49de-9f3a-da04c80c4eb3\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724266 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6lv\" (UniqueName: \"kubernetes.io/projected/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-kube-api-access-wj6lv\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxhmg\" (UniqueName: \"kubernetes.io/projected/d89c2056-bfda-4177-86e6-bc00964d5f22-kube-api-access-sxhmg\") pod \"octavia-operator-controller-manager-59d6cfdf45-g2lmn\" (UID: \"d89c2056-bfda-4177-86e6-bc00964d5f22\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724405 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ccgxh\" (UniqueName: \"kubernetes.io/projected/c2bd7e2c-d50f-4808-85f5-4bae9b91d272-kube-api-access-ccgxh\") pod \"mariadb-operator-controller-manager-5c468bf4d4-57hjm\" (UID: \"c2bd7e2c-d50f-4808-85f5-4bae9b91d272\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpxs\" (UniqueName: \"kubernetes.io/projected/22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff-kube-api-access-fxpxs\") pod \"manila-operator-controller-manager-6fd6854b49-hwgrp\" (UID: \"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzj2g\" (UniqueName: \"kubernetes.io/projected/30b85c17-5d08-452e-9099-20e55cc86f9e-kube-api-access-kzj2g\") pod \"ironic-operator-controller-manager-84bc9db6cc-886ls\" (UID: \"30b85c17-5d08-452e-9099-20e55cc86f9e\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mjw\" (UniqueName: \"kubernetes.io/projected/f87d6d15-86ff-46e3-8361-447ce9aff98c-kube-api-access-w7mjw\") pod \"heat-operator-controller-manager-599898f689-28j82\" (UID: \"f87d6d15-86ff-46e3-8361-447ce9aff98c\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmzb\" (UniqueName: 
\"kubernetes.io/projected/6d4671d7-88da-4834-afb5-5deaf7e84cdb-kube-api-access-pjmzb\") pod \"keystone-operator-controller-manager-7f55849f88-9mjtf\" (UID: \"6d4671d7-88da-4834-afb5-5deaf7e84cdb\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724602 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2kx\" (UniqueName: \"kubernetes.io/projected/18e949b2-6404-43e9-954b-6a09780bf021-kube-api-access-xx2kx\") pod \"nova-operator-controller-manager-555c7456bd-zxm4p\" (UID: \"18e949b2-6404-43e9-954b-6a09780bf021\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.724787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfstk\" (UniqueName: \"kubernetes.io/projected/ba5e23b3-da6b-41ec-9cac-11d4bc3a068b-kube-api-access-dfstk\") pod \"horizon-operator-controller-manager-6769b867d9-hkdrb\" (UID: \"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.727869 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.735490 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk"] Oct 02 18:36:10 crc kubenswrapper[4909]: E1002 18:36:10.735725 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 18:36:10 crc kubenswrapper[4909]: E1002 18:36:10.745606 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert podName:d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1 nodeName:}" failed. No retries permitted until 2025-10-02 18:36:11.24557237 +0000 UTC m=+1092.433068229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert") pod "infra-operator-controller-manager-5fbf469cd7-bh4kt" (UID: "d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1") : secret "infra-operator-webhook-server-cert" not found Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.764633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfstk\" (UniqueName: \"kubernetes.io/projected/ba5e23b3-da6b-41ec-9cac-11d4bc3a068b-kube-api-access-dfstk\") pod \"horizon-operator-controller-manager-6769b867d9-hkdrb\" (UID: \"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.776401 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mjw\" (UniqueName: \"kubernetes.io/projected/f87d6d15-86ff-46e3-8361-447ce9aff98c-kube-api-access-w7mjw\") pod \"heat-operator-controller-manager-599898f689-28j82\" (UID: \"f87d6d15-86ff-46e3-8361-447ce9aff98c\") " 
pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.785506 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2kx\" (UniqueName: \"kubernetes.io/projected/18e949b2-6404-43e9-954b-6a09780bf021-kube-api-access-xx2kx\") pod \"nova-operator-controller-manager-555c7456bd-zxm4p\" (UID: \"18e949b2-6404-43e9-954b-6a09780bf021\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.785651 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.785960 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.786297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxhmg\" (UniqueName: \"kubernetes.io/projected/d89c2056-bfda-4177-86e6-bc00964d5f22-kube-api-access-sxhmg\") pod \"octavia-operator-controller-manager-59d6cfdf45-g2lmn\" (UID: \"d89c2056-bfda-4177-86e6-bc00964d5f22\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.787109 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccgxh\" (UniqueName: \"kubernetes.io/projected/c2bd7e2c-d50f-4808-85f5-4bae9b91d272-kube-api-access-ccgxh\") pod \"mariadb-operator-controller-manager-5c468bf4d4-57hjm\" (UID: \"c2bd7e2c-d50f-4808-85f5-4bae9b91d272\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.808810 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fxpxs\" (UniqueName: \"kubernetes.io/projected/22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff-kube-api-access-fxpxs\") pod \"manila-operator-controller-manager-6fd6854b49-hwgrp\" (UID: \"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.809266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6lv\" (UniqueName: \"kubernetes.io/projected/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-kube-api-access-wj6lv\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.811448 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.812175 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.813494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmzb\" (UniqueName: \"kubernetes.io/projected/6d4671d7-88da-4834-afb5-5deaf7e84cdb-kube-api-access-pjmzb\") pod \"keystone-operator-controller-manager-7f55849f88-9mjtf\" (UID: \"6d4671d7-88da-4834-afb5-5deaf7e84cdb\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.815325 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.815825 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zspk5" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.815882 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-67xx7" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.816012 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.825520 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzj2g\" (UniqueName: \"kubernetes.io/projected/30b85c17-5d08-452e-9099-20e55cc86f9e-kube-api-access-kzj2g\") pod \"ironic-operator-controller-manager-84bc9db6cc-886ls\" (UID: \"30b85c17-5d08-452e-9099-20e55cc86f9e\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.828786 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4rt6l\" (UniqueName: \"kubernetes.io/projected/404b32bd-4a96-46d7-b8fa-bcd5fde074aa-kube-api-access-4rt6l\") pod \"ovn-operator-controller-manager-688db7b6c7-p9sbt\" (UID: \"404b32bd-4a96-46d7-b8fa-bcd5fde074aa\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.828966 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jggxs\" (UniqueName: \"kubernetes.io/projected/9a100c33-7cb7-4af6-8262-c6255b252462-kube-api-access-jggxs\") pod \"placement-operator-controller-manager-7d8bb7f44c-d5vkl\" (UID: \"9a100c33-7cb7-4af6-8262-c6255b252462\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.838700 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.842399 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ntj\" (UniqueName: \"kubernetes.io/projected/5198c9b3-5c15-49de-9f3a-da04c80c4eb3-kube-api-access-m8ntj\") pod \"neutron-operator-controller-manager-6574bf987d-94lx8\" (UID: \"5198c9b3-5c15-49de-9f3a-da04c80c4eb3\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.842817 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.868154 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.869961 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.872446 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.874742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-l7bcj" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.907982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.927408 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.931332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.931365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46kjm\" (UniqueName: \"kubernetes.io/projected/b96ea387-f91f-47ef-ba02-4f61e8a750a3-kube-api-access-46kjm\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.931420 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rt6l\" (UniqueName: \"kubernetes.io/projected/404b32bd-4a96-46d7-b8fa-bcd5fde074aa-kube-api-access-4rt6l\") pod \"ovn-operator-controller-manager-688db7b6c7-p9sbt\" (UID: \"404b32bd-4a96-46d7-b8fa-bcd5fde074aa\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.931445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9d4\" (UniqueName: \"kubernetes.io/projected/989d2b9b-e976-4e58-a506-5d5755154be0-kube-api-access-4s9d4\") pod \"swift-operator-controller-manager-6859f9b676-dmsmk\" (UID: \"989d2b9b-e976-4e58-a506-5d5755154be0\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.931494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jggxs\" (UniqueName: \"kubernetes.io/projected/9a100c33-7cb7-4af6-8262-c6255b252462-kube-api-access-jggxs\") pod \"placement-operator-controller-manager-7d8bb7f44c-d5vkl\" (UID: \"9a100c33-7cb7-4af6-8262-c6255b252462\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.948561 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5"] Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.975015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jggxs\" (UniqueName: \"kubernetes.io/projected/9a100c33-7cb7-4af6-8262-c6255b252462-kube-api-access-jggxs\") pod \"placement-operator-controller-manager-7d8bb7f44c-d5vkl\" (UID: \"9a100c33-7cb7-4af6-8262-c6255b252462\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:10 
crc kubenswrapper[4909]: I1002 18:36:10.983752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rt6l\" (UniqueName: \"kubernetes.io/projected/404b32bd-4a96-46d7-b8fa-bcd5fde074aa-kube-api-access-4rt6l\") pod \"ovn-operator-controller-manager-688db7b6c7-p9sbt\" (UID: \"404b32bd-4a96-46d7-b8fa-bcd5fde074aa\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:10 crc kubenswrapper[4909]: I1002 18:36:10.987508 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.005615 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.021646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.028485 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.031735 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.032999 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6r2\" (UniqueName: \"kubernetes.io/projected/8e79ecec-ac98-42dd-b513-071fbd0e235a-kube-api-access-lw6r2\") pod \"telemetry-operator-controller-manager-769bf6645d-mpcx5\" (UID: \"8e79ecec-ac98-42dd-b513-071fbd0e235a\") " pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.033056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.033080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46kjm\" (UniqueName: \"kubernetes.io/projected/b96ea387-f91f-47ef-ba02-4f61e8a750a3-kube-api-access-46kjm\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.033268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9d4\" (UniqueName: \"kubernetes.io/projected/989d2b9b-e976-4e58-a506-5d5755154be0-kube-api-access-4s9d4\") pod \"swift-operator-controller-manager-6859f9b676-dmsmk\" (UID: \"989d2b9b-e976-4e58-a506-5d5755154be0\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:11 crc 
kubenswrapper[4909]: E1002 18:36:11.033490 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: E1002 18:36:11.038475 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert podName:b96ea387-f91f-47ef-ba02-4f61e8a750a3 nodeName:}" failed. No retries permitted until 2025-10-02 18:36:11.538445352 +0000 UTC m=+1092.725941211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678btf42" (UID: "b96ea387-f91f-47ef-ba02-4f61e8a750a3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.039597 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.059293 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.060709 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7jpg6" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.068431 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.101060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9d4\" (UniqueName: \"kubernetes.io/projected/989d2b9b-e976-4e58-a506-5d5755154be0-kube-api-access-4s9d4\") pod \"swift-operator-controller-manager-6859f9b676-dmsmk\" (UID: \"989d2b9b-e976-4e58-a506-5d5755154be0\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.106869 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46kjm\" (UniqueName: \"kubernetes.io/projected/b96ea387-f91f-47ef-ba02-4f61e8a750a3-kube-api-access-46kjm\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.118443 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.119906 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.122656 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rpkvb" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.129708 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.140219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6r2\" (UniqueName: \"kubernetes.io/projected/8e79ecec-ac98-42dd-b513-071fbd0e235a-kube-api-access-lw6r2\") pod \"telemetry-operator-controller-manager-769bf6645d-mpcx5\" (UID: \"8e79ecec-ac98-42dd-b513-071fbd0e235a\") " pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.140358 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn2jm\" (UniqueName: \"kubernetes.io/projected/59585305-96e5-40d8-b9f0-37d76e01f40f-kube-api-access-dn2jm\") pod \"test-operator-controller-manager-5cd5cb47d7-mfzz4\" (UID: \"59585305-96e5-40d8-b9f0-37d76e01f40f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.164157 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.165556 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.168098 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8rj5n" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.168331 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.171966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6r2\" (UniqueName: \"kubernetes.io/projected/8e79ecec-ac98-42dd-b513-071fbd0e235a-kube-api-access-lw6r2\") pod \"telemetry-operator-controller-manager-769bf6645d-mpcx5\" (UID: \"8e79ecec-ac98-42dd-b513-071fbd0e235a\") " pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.175699 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.176475 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.177558 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.180232 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jn2mv" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.197000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.229462 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.242306 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn2jm\" (UniqueName: \"kubernetes.io/projected/59585305-96e5-40d8-b9f0-37d76e01f40f-kube-api-access-dn2jm\") pod \"test-operator-controller-manager-5cd5cb47d7-mfzz4\" (UID: \"59585305-96e5-40d8-b9f0-37d76e01f40f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.242344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfjm\" (UniqueName: \"kubernetes.io/projected/92aaf752-fa7f-42da-98a1-298dc1f1f745-kube-api-access-xxfjm\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.242386 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bqd\" (UniqueName: \"kubernetes.io/projected/40106a12-6dd2-492f-94e0-b3ab9e866b81-kube-api-access-65bqd\") pod 
\"watcher-operator-controller-manager-fcd7d9895-8pxwr\" (UID: \"40106a12-6dd2-492f-94e0-b3ab9e866b81\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.242440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.242472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljzx\" (UniqueName: \"kubernetes.io/projected/2b364096-1561-45dd-9c7c-0e20f76360a6-kube-api-access-qljzx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf\" (UID: \"2b364096-1561-45dd-9c7c-0e20f76360a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.247674 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.268971 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn2jm\" (UniqueName: \"kubernetes.io/projected/59585305-96e5-40d8-b9f0-37d76e01f40f-kube-api-access-dn2jm\") pod \"test-operator-controller-manager-5cd5cb47d7-mfzz4\" (UID: \"59585305-96e5-40d8-b9f0-37d76e01f40f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.278817 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.285523 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.326865 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.344578 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.344641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.344680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljzx\" (UniqueName: \"kubernetes.io/projected/2b364096-1561-45dd-9c7c-0e20f76360a6-kube-api-access-qljzx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf\" (UID: \"2b364096-1561-45dd-9c7c-0e20f76360a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.344752 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfjm\" 
(UniqueName: \"kubernetes.io/projected/92aaf752-fa7f-42da-98a1-298dc1f1f745-kube-api-access-xxfjm\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.344790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bqd\" (UniqueName: \"kubernetes.io/projected/40106a12-6dd2-492f-94e0-b3ab9e866b81-kube-api-access-65bqd\") pod \"watcher-operator-controller-manager-fcd7d9895-8pxwr\" (UID: \"40106a12-6dd2-492f-94e0-b3ab9e866b81\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:11 crc kubenswrapper[4909]: E1002 18:36:11.345152 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: E1002 18:36:11.345199 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert podName:92aaf752-fa7f-42da-98a1-298dc1f1f745 nodeName:}" failed. No retries permitted until 2025-10-02 18:36:11.845187034 +0000 UTC m=+1093.032682893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert") pod "openstack-operator-controller-manager-7bffff79d9-6k5xz" (UID: "92aaf752-fa7f-42da-98a1-298dc1f1f745") : secret "webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.356436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-bh4kt\" (UID: \"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.385641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljzx\" (UniqueName: \"kubernetes.io/projected/2b364096-1561-45dd-9c7c-0e20f76360a6-kube-api-access-qljzx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf\" (UID: \"2b364096-1561-45dd-9c7c-0e20f76360a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.391035 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bqd\" (UniqueName: \"kubernetes.io/projected/40106a12-6dd2-492f-94e0-b3ab9e866b81-kube-api-access-65bqd\") pod \"watcher-operator-controller-manager-fcd7d9895-8pxwr\" (UID: \"40106a12-6dd2-492f-94e0-b3ab9e866b81\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.407429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfjm\" (UniqueName: \"kubernetes.io/projected/92aaf752-fa7f-42da-98a1-298dc1f1f745-kube-api-access-xxfjm\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " 
pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.420193 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.464239 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.473980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.487681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" event={"ID":"c108279d-762f-41ee-a725-55e9f58a8686","Type":"ContainerStarted","Data":"76518614e0c9e9dea1bed48f538f61a7bbb9afe5dfc788b7ab608cf30ea12edd"} Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.537837 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.557276 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: W1002 18:36:11.566709 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848033b4_f22a_4455_8699_45b7761bbee2.slice/crio-87d4325b4dafa199015655a87b79181507c2485dc85a3135826533703d09e38c WatchSource:0}: Error finding container 87d4325b4dafa199015655a87b79181507c2485dc85a3135826533703d09e38c: Status 404 returned error can't find the container with id 87d4325b4dafa199015655a87b79181507c2485dc85a3135826533703d09e38c Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.572285 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b96ea387-f91f-47ef-ba02-4f61e8a750a3-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678btf42\" (UID: \"b96ea387-f91f-47ef-ba02-4f61e8a750a3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.575120 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.600278 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.875756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:11 crc kubenswrapper[4909]: E1002 18:36:11.876606 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: E1002 18:36:11.876737 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert podName:92aaf752-fa7f-42da-98a1-298dc1f1f745 nodeName:}" failed. No retries permitted until 2025-10-02 18:36:12.876699224 +0000 UTC m=+1094.064195083 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert") pod "openstack-operator-controller-manager-7bffff79d9-6k5xz" (UID: "92aaf752-fa7f-42da-98a1-298dc1f1f745") : secret "webhook-server-cert" not found Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.909575 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp"] Oct 02 18:36:11 crc kubenswrapper[4909]: I1002 18:36:11.918765 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-87dms"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.290365 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-28j82"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.400616 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb"] Oct 02 18:36:12 crc kubenswrapper[4909]: W1002 18:36:12.403622 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5e23b3_da6b_41ec_9cac_11d4bc3a068b.slice/crio-f481bbe23f69658c77299c7aba011564c12839c1bb797084961617ba9b15f369 WatchSource:0}: Error finding container f481bbe23f69658c77299c7aba011564c12839c1bb797084961617ba9b15f369: Status 404 returned error can't find the container with id f481bbe23f69658c77299c7aba011564c12839c1bb797084961617ba9b15f369 Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.497387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" event={"ID":"f87d6d15-86ff-46e3-8361-447ce9aff98c","Type":"ContainerStarted","Data":"80267a46e76af1a263a6722a93247e6ff3ee4060673f4fecab97a64c07a5e8e3"} Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 
18:36:12.498715 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" event={"ID":"27b610f7-4219-4cb1-97b4-76a4627afc7a","Type":"ContainerStarted","Data":"58988144ddbac961c804915e472fe7d2003ca44f57fb539706d7a7df766aac27"} Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.499501 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" event={"ID":"848033b4-f22a-4455-8699-45b7761bbee2","Type":"ContainerStarted","Data":"87d4325b4dafa199015655a87b79181507c2485dc85a3135826533703d09e38c"} Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.505880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" event={"ID":"6f98751d-f257-4445-a739-7ff447f1c3d8","Type":"ContainerStarted","Data":"b06efd082c80e7300219411354e774d03a576e494f69cc336983e2dfc70d166e"} Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.507647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" event={"ID":"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b","Type":"ContainerStarted","Data":"f481bbe23f69658c77299c7aba011564c12839c1bb797084961617ba9b15f369"} Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.766976 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.781260 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.801983 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 
18:36:12.810412 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.816475 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.821250 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.829892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.834394 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn"] Oct 02 18:36:12 crc kubenswrapper[4909]: W1002 18:36:12.838694 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a100c33_7cb7_4af6_8262_c6255b252462.slice/crio-59299dae7ee0c2b6691ad1cca49bdb45f56a893f05c7752d42727a0b6326b443 WatchSource:0}: Error finding container 59299dae7ee0c2b6691ad1cca49bdb45f56a893f05c7752d42727a0b6326b443: Status 404 returned error can't find the container with id 59299dae7ee0c2b6691ad1cca49bdb45f56a893f05c7752d42727a0b6326b443 Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.841475 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p"] Oct 02 18:36:12 crc kubenswrapper[4909]: E1002 18:36:12.845247 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jggxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-7d8bb7f44c-d5vkl_openstack-operators(9a100c33-7cb7-4af6-8262-c6255b252462): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.845680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4"] Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.898528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:12 crc kubenswrapper[4909]: I1002 18:36:12.920277 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92aaf752-fa7f-42da-98a1-298dc1f1f745-cert\") pod \"openstack-operator-controller-manager-7bffff79d9-6k5xz\" (UID: \"92aaf752-fa7f-42da-98a1-298dc1f1f745\") " 
pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.012298 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:13 crc kubenswrapper[4909]: E1002 18:36:13.060064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" podUID="9a100c33-7cb7-4af6-8262-c6255b252462" Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.206127 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8"] Oct 02 18:36:13 crc kubenswrapper[4909]: W1002 18:36:13.216184 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5198c9b3_5c15_49de_9f3a_da04c80c4eb3.slice/crio-5987d252b2a3c903f047a34ebc04b6ede51e3da4128a52142a0ffb30e2b5cfc0 WatchSource:0}: Error finding container 5987d252b2a3c903f047a34ebc04b6ede51e3da4128a52142a0ffb30e2b5cfc0: Status 404 returned error can't find the container with id 5987d252b2a3c903f047a34ebc04b6ede51e3da4128a52142a0ffb30e2b5cfc0 Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.221158 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp"] Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.233745 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt"] Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.239441 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr"] Oct 02 18:36:13 crc 
kubenswrapper[4909]: I1002 18:36:13.250901 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42"] Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.268246 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk"] Oct 02 18:36:13 crc kubenswrapper[4909]: W1002 18:36:13.276977 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27e9c1d_e4ec_432f_a2f1_a27cf88ed6e1.slice/crio-37f078fce6af319a509499685dc6b0a7446b882efa02ce318c823b4e1f7f3c96 WatchSource:0}: Error finding container 37f078fce6af319a509499685dc6b0a7446b882efa02ce318c823b4e1f7f3c96: Status 404 returned error can't find the container with id 37f078fce6af319a509499685dc6b0a7446b882efa02ce318c823b4e1f7f3c96 Oct 02 18:36:13 crc kubenswrapper[4909]: W1002 18:36:13.279754 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96ea387_f91f_47ef_ba02_4f61e8a750a3.slice/crio-19815dce6af85a27fd52baaeca106b8c7908c5f8ea7090d58aaedd53ee8d1fd1 WatchSource:0}: Error finding container 19815dce6af85a27fd52baaeca106b8c7908c5f8ea7090d58aaedd53ee8d1fd1: Status 404 returned error can't find the container with id 19815dce6af85a27fd52baaeca106b8c7908c5f8ea7090d58aaedd53ee8d1fd1 Oct 02 18:36:13 crc kubenswrapper[4909]: E1002 18:36:13.326828 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46kjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6f64c4d678btf42_openstack-operators(b96ea387-f91f-47ef-ba02-4f61e8a750a3): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Oct 02 18:36:13 crc kubenswrapper[4909]: E1002 18:36:13.330003 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-65bqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-fcd7d9895-8pxwr_openstack-operators(40106a12-6dd2-492f-94e0-b3ab9e866b81): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 18:36:13 crc kubenswrapper[4909]: E1002 18:36:13.330159 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4s9d4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-dmsmk_openstack-operators(989d2b9b-e976-4e58-a506-5d5755154be0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.521998 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz"] Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.527216 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" event={"ID":"9a100c33-7cb7-4af6-8262-c6255b252462","Type":"ContainerStarted","Data":"94f4b4d4da437f1399e37f69a3879273b9f139db2d571e7b050ba79a60081d50"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.527269 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" event={"ID":"9a100c33-7cb7-4af6-8262-c6255b252462","Type":"ContainerStarted","Data":"59299dae7ee0c2b6691ad1cca49bdb45f56a893f05c7752d42727a0b6326b443"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.556918 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" event={"ID":"30b85c17-5d08-452e-9099-20e55cc86f9e","Type":"ContainerStarted","Data":"868fa4989f580599190c45367efb83f101d9091bea3cf86370bd9808f2787b42"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.559067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" event={"ID":"59585305-96e5-40d8-b9f0-37d76e01f40f","Type":"ContainerStarted","Data":"7ef9a41b034c537c5a16ae87ccd12fbc295618e7065793b38115c90a94581e57"} Oct 02 18:36:13 crc kubenswrapper[4909]: E1002 18:36:13.559332 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" podUID="9a100c33-7cb7-4af6-8262-c6255b252462" Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.561613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" 
event={"ID":"18e949b2-6404-43e9-954b-6a09780bf021","Type":"ContainerStarted","Data":"be8cb3f3a9ffe126a71465e807f1e6497797c4fe7d95677e2471261f6eff7a83"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.565929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" event={"ID":"989d2b9b-e976-4e58-a506-5d5755154be0","Type":"ContainerStarted","Data":"ba48c05cb0c72a9b54b1fd3b4e0db53bb64822fa99220ba522635efea2b7b121"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.569777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" event={"ID":"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff","Type":"ContainerStarted","Data":"096081c60ba775969d55c82b49ca43f44e3aa8f35058ac38f8fc36db61abc3e5"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.572360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" event={"ID":"6d4671d7-88da-4834-afb5-5deaf7e84cdb","Type":"ContainerStarted","Data":"1a4cec186a5ed7e55f89c654477485d1e92936047d2e23e225ec8d2bbcac650b"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.579660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" event={"ID":"c2bd7e2c-d50f-4808-85f5-4bae9b91d272","Type":"ContainerStarted","Data":"755701192ddd1f54aef8ac7be7d30938d1268ffe10cc786e0198cbd28e9951ce"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.582637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" event={"ID":"5198c9b3-5c15-49de-9f3a-da04c80c4eb3","Type":"ContainerStarted","Data":"5987d252b2a3c903f047a34ebc04b6ede51e3da4128a52142a0ffb30e2b5cfc0"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.584344 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" event={"ID":"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1","Type":"ContainerStarted","Data":"37f078fce6af319a509499685dc6b0a7446b882efa02ce318c823b4e1f7f3c96"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.585877 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" event={"ID":"2b364096-1561-45dd-9c7c-0e20f76360a6","Type":"ContainerStarted","Data":"5eeaa48e86bff7226d5cac4ed4c054d87b45c1e672f6e7c1dd9dfd2faad0c690"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.589536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" event={"ID":"404b32bd-4a96-46d7-b8fa-bcd5fde074aa","Type":"ContainerStarted","Data":"ad0e362dc89d4e07827d00b6fd1849c09cbda27e1590b20afdcea0b1b2190849"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.592275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" event={"ID":"40106a12-6dd2-492f-94e0-b3ab9e866b81","Type":"ContainerStarted","Data":"78b417fcc6fd511f9fd70f8977a9d7ca8dcb92170d10d1e4118a53e4f7edced2"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.601668 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" event={"ID":"8e79ecec-ac98-42dd-b513-071fbd0e235a","Type":"ContainerStarted","Data":"9f682ab52e3b1a36434ad982894ae43599c0a9ff7eccc62094e47eaf080f19ef"} Oct 02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.605975 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" event={"ID":"b96ea387-f91f-47ef-ba02-4f61e8a750a3","Type":"ContainerStarted","Data":"19815dce6af85a27fd52baaeca106b8c7908c5f8ea7090d58aaedd53ee8d1fd1"} Oct 
02 18:36:13 crc kubenswrapper[4909]: I1002 18:36:13.625696 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" event={"ID":"d89c2056-bfda-4177-86e6-bc00964d5f22","Type":"ContainerStarted","Data":"6c593683e7a5d9cec2300774dc66bc5a7dd5260c51202d8c9ae8c9545be838db"} Oct 02 18:36:14 crc kubenswrapper[4909]: E1002 18:36:14.618123 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" podUID="9a100c33-7cb7-4af6-8262-c6255b252462" Oct 02 18:36:15 crc kubenswrapper[4909]: E1002 18:36:15.879084 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" podUID="40106a12-6dd2-492f-94e0-b3ab9e866b81" Oct 02 18:36:16 crc kubenswrapper[4909]: W1002 18:36:16.531638 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92aaf752_fa7f_42da_98a1_298dc1f1f745.slice/crio-458b00c24dcaf1117db953ea610acd8362ac977ac9c5c1d1d0cbd773a982e2e0 WatchSource:0}: Error finding container 458b00c24dcaf1117db953ea610acd8362ac977ac9c5c1d1d0cbd773a982e2e0: Status 404 returned error can't find the container with id 458b00c24dcaf1117db953ea610acd8362ac977ac9c5c1d1d0cbd773a982e2e0 Oct 02 18:36:16 crc kubenswrapper[4909]: I1002 18:36:16.635379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" 
event={"ID":"40106a12-6dd2-492f-94e0-b3ab9e866b81","Type":"ContainerStarted","Data":"c3750edfc137a3cc6be9e705732b6879ce622c19db860ac8dcd50900bd03ab24"} Oct 02 18:36:16 crc kubenswrapper[4909]: I1002 18:36:16.637140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" event={"ID":"92aaf752-fa7f-42da-98a1-298dc1f1f745","Type":"ContainerStarted","Data":"458b00c24dcaf1117db953ea610acd8362ac977ac9c5c1d1d0cbd773a982e2e0"} Oct 02 18:36:16 crc kubenswrapper[4909]: E1002 18:36:16.644985 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" podUID="40106a12-6dd2-492f-94e0-b3ab9e866b81" Oct 02 18:36:16 crc kubenswrapper[4909]: E1002 18:36:16.763292 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" podUID="989d2b9b-e976-4e58-a506-5d5755154be0" Oct 02 18:36:17 crc kubenswrapper[4909]: I1002 18:36:17.650642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" event={"ID":"989d2b9b-e976-4e58-a506-5d5755154be0","Type":"ContainerStarted","Data":"e26774ae901122aa93fbc55588329de1ee07836bb288c309a06e1a0939de0dd7"} Oct 02 18:36:17 crc kubenswrapper[4909]: E1002 18:36:17.652229 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" podUID="40106a12-6dd2-492f-94e0-b3ab9e866b81" Oct 02 18:36:17 crc kubenswrapper[4909]: E1002 18:36:17.652488 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" podUID="989d2b9b-e976-4e58-a506-5d5755154be0" Oct 02 18:36:18 crc kubenswrapper[4909]: E1002 18:36:18.660126 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" podUID="989d2b9b-e976-4e58-a506-5d5755154be0" Oct 02 18:36:29 crc kubenswrapper[4909]: E1002 18:36:29.389985 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54" Oct 02 18:36:29 crc kubenswrapper[4909]: E1002 18:36:29.390607 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ccgxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-5c468bf4d4-57hjm_openstack-operators(c2bd7e2c-d50f-4808-85f5-4bae9b91d272): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.042705 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.042981 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rt6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-688db7b6c7-p9sbt_openstack-operators(404b32bd-4a96-46d7-b8fa-bcd5fde074aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.519058 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.519286 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qljzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf_openstack-operators(2b364096-1561-45dd-9c7c-0e20f76360a6): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.520939 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" podUID="2b364096-1561-45dd-9c7c-0e20f76360a6" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.613263 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.39:5001/openstack-k8s-operators/telemetry-operator:39199b16fb4ef65d07a91339e59f624612b96660" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.613400 4909 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.39:5001/openstack-k8s-operators/telemetry-operator:39199b16fb4ef65d07a91339e59f624612b96660" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.613594 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.39:5001/openstack-k8s-operators/telemetry-operator:39199b16fb4ef65d07a91339e59f624612b96660,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lw6r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-769bf6645d-mpcx5_openstack-operators(8e79ecec-ac98-42dd-b513-071fbd0e235a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:30 crc kubenswrapper[4909]: E1002 18:36:30.798895 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" podUID="2b364096-1561-45dd-9c7c-0e20f76360a6" Oct 02 18:36:31 crc kubenswrapper[4909]: E1002 18:36:31.089511 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7" Oct 02 18:36:31 crc kubenswrapper[4909]: E1002 18:36:31.089696 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjmzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7f55849f88-9mjtf_openstack-operators(6d4671d7-88da-4834-afb5-5deaf7e84cdb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:31 crc kubenswrapper[4909]: E1002 18:36:31.573537 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475" Oct 02 18:36:31 crc kubenswrapper[4909]: E1002 18:36:31.573783 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wj6lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-5fbf469cd7-bh4kt_openstack-operators(d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.019139 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.019302 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxpxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
manila-operator-controller-manager-6fd6854b49-hwgrp_openstack-operators(22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.429705 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.430014 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8ntj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6574bf987d-94lx8_openstack-operators(5198c9b3-5c15-49de-9f3a-da04c80c4eb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.959889 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01" Oct 02 18:36:32 crc kubenswrapper[4909]: E1002 18:36:32.960231 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxhmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-59d6cfdf45-g2lmn_openstack-operators(d89c2056-bfda-4177-86e6-bc00964d5f22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:36:34 crc kubenswrapper[4909]: E1002 18:36:34.386700 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" podUID="b96ea387-f91f-47ef-ba02-4f61e8a750a3" Oct 02 18:36:34 crc kubenswrapper[4909]: I1002 18:36:34.827296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" event={"ID":"b96ea387-f91f-47ef-ba02-4f61e8a750a3","Type":"ContainerStarted","Data":"2b3791d32d677ad2817b63355d957d206cc411297e8f687583b64ea5fbede383"} Oct 02 18:36:35 crc kubenswrapper[4909]: I1002 18:36:35.837205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" event={"ID":"6f98751d-f257-4445-a739-7ff447f1c3d8","Type":"ContainerStarted","Data":"9b77666a98870368a67a5c9cbf28ca0ddde9b3715fd3df0402de7ae8584f57e5"} Oct 02 18:36:35 crc kubenswrapper[4909]: I1002 18:36:35.839274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" event={"ID":"92aaf752-fa7f-42da-98a1-298dc1f1f745","Type":"ContainerStarted","Data":"44716f867659b73f229487ae249d8ddeb7790c822c282fdba30dd9013d2a094d"} Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.499683 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" 
podUID="404b32bd-4a96-46d7-b8fa-bcd5fde074aa" Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.598637 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" podUID="c2bd7e2c-d50f-4808-85f5-4bae9b91d272" Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.641190 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" podUID="6d4671d7-88da-4834-afb5-5deaf7e84cdb" Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.674109 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" podUID="d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1" Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.863258 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" event={"ID":"18e949b2-6404-43e9-954b-6a09780bf021","Type":"ContainerStarted","Data":"49c97eabf4c472ffd469c1352e2e770b4ed8974e59508be140ddc939e1ca0248"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.877304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" event={"ID":"404b32bd-4a96-46d7-b8fa-bcd5fde074aa","Type":"ContainerStarted","Data":"78ecf4ab89e4ef7c967fbfe4084e492afb070c572fa97f9a8851d956a5e03659"} Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.882018 4909 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" podUID="404b32bd-4a96-46d7-b8fa-bcd5fde074aa" Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.887935 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" event={"ID":"27b610f7-4219-4cb1-97b4-76a4627afc7a","Type":"ContainerStarted","Data":"d62c68c819b48202b01174420c984456fb73651c55e7369a2fb7183e262101c9"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.893165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" event={"ID":"6d4671d7-88da-4834-afb5-5deaf7e84cdb","Type":"ContainerStarted","Data":"e0f49cb63fa000c6e2f1f918f7323ea8495a9d5d71c3a388711f95579bab9da2"} Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.895466 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" podUID="6d4671d7-88da-4834-afb5-5deaf7e84cdb" Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.916394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" event={"ID":"c2bd7e2c-d50f-4808-85f5-4bae9b91d272","Type":"ContainerStarted","Data":"eb23c7474dff9d84e39471b95bfc9ac37407879950b1fc6f6f90f75b02f2d137"} Oct 02 18:36:36 crc kubenswrapper[4909]: E1002 18:36:36.919196 4909 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" podUID="c2bd7e2c-d50f-4808-85f5-4bae9b91d272" Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.925217 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" event={"ID":"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b","Type":"ContainerStarted","Data":"6075b90e8c3a4e40a67da40fcff7ceefb7e5e18d829ca7ed393c298791ac13a6"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.955250 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" event={"ID":"30b85c17-5d08-452e-9099-20e55cc86f9e","Type":"ContainerStarted","Data":"74a075db818c2a9bf0378e9b3fd3d28edcd1d814889aee50703f12a9d462210e"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.962386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" event={"ID":"59585305-96e5-40d8-b9f0-37d76e01f40f","Type":"ContainerStarted","Data":"d3c75a9f87bf0750d428be496feae4120405d30a2d6f7ae9a041d686f61fd840"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.979425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" event={"ID":"c108279d-762f-41ee-a725-55e9f58a8686","Type":"ContainerStarted","Data":"e2b2a4515804fda7c3acf41857f680c8d565ca8534bf8bc580fb259c6fb1a3d5"} Oct 02 18:36:36 crc kubenswrapper[4909]: I1002 18:36:36.997770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" 
event={"ID":"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1","Type":"ContainerStarted","Data":"d7fb27c682111f53f369fb03a3e2cf23bc556293545fab3a98af611cf760f787"} Oct 02 18:36:37 crc kubenswrapper[4909]: E1002 18:36:37.008542 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" podUID="d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1" Oct 02 18:36:37 crc kubenswrapper[4909]: I1002 18:36:37.023677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" event={"ID":"f87d6d15-86ff-46e3-8361-447ce9aff98c","Type":"ContainerStarted","Data":"ce7c7bbf8e57f941aa0b362b8ec8aac9cf22395387c16f203880e5c1e4e66ecf"} Oct 02 18:36:37 crc kubenswrapper[4909]: E1002 18:36:37.277632 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" podUID="5198c9b3-5c15-49de-9f3a-da04c80c4eb3" Oct 02 18:36:37 crc kubenswrapper[4909]: E1002 18:36:37.352966 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" podUID="22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff" Oct 02 18:36:37 crc kubenswrapper[4909]: E1002 18:36:37.365762 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" podUID="d89c2056-bfda-4177-86e6-bc00964d5f22" Oct 02 18:36:37 crc kubenswrapper[4909]: E1002 18:36:37.443589 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" podUID="8e79ecec-ac98-42dd-b513-071fbd0e235a" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.076185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" event={"ID":"f87d6d15-86ff-46e3-8361-447ce9aff98c","Type":"ContainerStarted","Data":"8241f9e39ce41c504b0e5a7aa004f72414c40ddbd5575f4467e273b014854dd5"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.076547 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.099880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" event={"ID":"989d2b9b-e976-4e58-a506-5d5755154be0","Type":"ContainerStarted","Data":"5ec9c5dd018f2f4fde5b35793c907bb68ef2be17f61a0ac8ebcbf96d22609f03"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.100884 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.111016 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" event={"ID":"5198c9b3-5c15-49de-9f3a-da04c80c4eb3","Type":"ContainerStarted","Data":"4f73ff6407b5ba920a59fbfda3e07aa67ccf20f7afbdedf12340b6385d1e1548"} Oct 02 18:36:38 crc 
kubenswrapper[4909]: I1002 18:36:38.112368 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" podStartSLOduration=7.460644547 podStartE2EDuration="28.112348064s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.293925677 +0000 UTC m=+1093.481421546" lastFinishedPulling="2025-10-02 18:36:32.945629204 +0000 UTC m=+1114.133125063" observedRunningTime="2025-10-02 18:36:38.107215765 +0000 UTC m=+1119.294711634" watchObservedRunningTime="2025-10-02 18:36:38.112348064 +0000 UTC m=+1119.299843923" Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.115194 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" podUID="5198c9b3-5c15-49de-9f3a-da04c80c4eb3" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.128372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" event={"ID":"8e79ecec-ac98-42dd-b513-071fbd0e235a","Type":"ContainerStarted","Data":"a23ef7438af9f5a366db444d6e510c4cb3d05dd6d4cd5f7ac5590aee21f0c971"} Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.131995 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.39:5001/openstack-k8s-operators/telemetry-operator:39199b16fb4ef65d07a91339e59f624612b96660\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" podUID="8e79ecec-ac98-42dd-b513-071fbd0e235a" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.139267 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" event={"ID":"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff","Type":"ContainerStarted","Data":"737852cff3e2caf7a3aa1eddb8e0f61768cde5de170cc6e69392b19069ed4b26"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.142215 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" podStartSLOduration=5.606532997 podStartE2EDuration="28.142199143s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.330082503 +0000 UTC m=+1094.517578362" lastFinishedPulling="2025-10-02 18:36:35.865748649 +0000 UTC m=+1117.053244508" observedRunningTime="2025-10-02 18:36:38.139763527 +0000 UTC m=+1119.327259386" watchObservedRunningTime="2025-10-02 18:36:38.142199143 +0000 UTC m=+1119.329695002" Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.149533 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" podUID="22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.151510 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" event={"ID":"848033b4-f22a-4455-8699-45b7761bbee2","Type":"ContainerStarted","Data":"b56afaac15f30354dd6a661ce9765fc05a86048d4595b469b8383a234ec4f09f"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.151558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" 
event={"ID":"848033b4-f22a-4455-8699-45b7761bbee2","Type":"ContainerStarted","Data":"551a797a29c2f958b3ac3749e9f73e4bb0772d20bf9b80742286655c4feaea5c"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.151791 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.157085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" event={"ID":"30b85c17-5d08-452e-9099-20e55cc86f9e","Type":"ContainerStarted","Data":"5cec5ac5c0608d942766ca9ae1b0adaea06a511e0739f07ed5482a31698a8564"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.158059 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.173227 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" event={"ID":"ba5e23b3-da6b-41ec-9cac-11d4bc3a068b","Type":"ContainerStarted","Data":"4b23afd821472a7bbc08d158cb762b417ddcc43e76979b942838aa184a3152d7"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.174185 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.182600 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" event={"ID":"18e949b2-6404-43e9-954b-6a09780bf021","Type":"ContainerStarted","Data":"643ca0af021775d0163221af0e7a853255a412602279f4bc186226985d914744"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.183536 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.192087 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" event={"ID":"92aaf752-fa7f-42da-98a1-298dc1f1f745","Type":"ContainerStarted","Data":"660278a131ac3cc527a2cfbe8d681e90cfbb4c3a22d4288abac3e2468ed26290"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.192337 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.203318 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" event={"ID":"9a100c33-7cb7-4af6-8262-c6255b252462","Type":"ContainerStarted","Data":"94140cc7bb74bd107e4fe8839720c9c2f83f8433f79b78ac8103c5489860c366"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.204669 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.227698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" event={"ID":"40106a12-6dd2-492f-94e0-b3ab9e866b81","Type":"ContainerStarted","Data":"9527a1a3c6d8eb5f3a0983b62cf41888f19330c177ed97adb42b479777821d4d"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.228533 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.243062 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" 
event={"ID":"27b610f7-4219-4cb1-97b4-76a4627afc7a","Type":"ContainerStarted","Data":"7fcffd7016a839ec098bfed8a4cf2cbcd36e74893eb5ae481ec3d0ad7898c3bf"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.244091 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.250503 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" podStartSLOduration=7.715698646 podStartE2EDuration="28.250487293s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.406978806 +0000 UTC m=+1093.594474655" lastFinishedPulling="2025-10-02 18:36:32.941767443 +0000 UTC m=+1114.129263302" observedRunningTime="2025-10-02 18:36:38.24266562 +0000 UTC m=+1119.430161479" watchObservedRunningTime="2025-10-02 18:36:38.250487293 +0000 UTC m=+1119.437983152" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.252573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" event={"ID":"6f98751d-f257-4445-a739-7ff447f1c3d8","Type":"ContainerStarted","Data":"fe9a0566061bdcaba1b5137ef84b2efc1995a4d136cf7761776eda249a5cfa4a"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.252653 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.259986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" event={"ID":"59585305-96e5-40d8-b9f0-37d76e01f40f","Type":"ContainerStarted","Data":"212d9cb962a606583186ed6c13a76a5c09b3ef6a348113746c37a4357fde8a54"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.260188 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.263381 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" event={"ID":"c108279d-762f-41ee-a725-55e9f58a8686","Type":"ContainerStarted","Data":"b2edce756a3682c40cf9c9eb171f1cb028505eac29667dde008ff204f79339dc"} Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.263829 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.265646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" event={"ID":"d89c2056-bfda-4177-86e6-bc00964d5f22","Type":"ContainerStarted","Data":"137d2568bed75237c7a3afe2db63bbee07a039d2b6bcaa74b143eac020b9f85f"} Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.269255 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" podUID="c2bd7e2c-d50f-4808-85f5-4bae9b91d272" Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.269295 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" podUID="d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1" Oct 02 
18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.269329 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" podUID="d89c2056-bfda-4177-86e6-bc00964d5f22" Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.269355 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" podUID="6d4671d7-88da-4834-afb5-5deaf7e84cdb" Oct 02 18:36:38 crc kubenswrapper[4909]: E1002 18:36:38.269391 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" podUID="404b32bd-4a96-46d7-b8fa-bcd5fde074aa" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.278700 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" podStartSLOduration=5.303296377 podStartE2EDuration="28.27867505s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.845017305 +0000 UTC m=+1094.032513164" lastFinishedPulling="2025-10-02 18:36:35.820395968 +0000 UTC m=+1117.007891837" observedRunningTime="2025-10-02 18:36:38.275306955 +0000 UTC m=+1119.462802814" 
watchObservedRunningTime="2025-10-02 18:36:38.27867505 +0000 UTC m=+1119.466170939" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.308259 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" podStartSLOduration=5.86664324 podStartE2EDuration="28.30823482s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.828248024 +0000 UTC m=+1094.015743883" lastFinishedPulling="2025-10-02 18:36:35.269839584 +0000 UTC m=+1116.457335463" observedRunningTime="2025-10-02 18:36:38.296269377 +0000 UTC m=+1119.483765236" watchObservedRunningTime="2025-10-02 18:36:38.30823482 +0000 UTC m=+1119.495730679" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.341287 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" podStartSLOduration=5.8941944280000005 podStartE2EDuration="28.341255078s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.82295405 +0000 UTC m=+1094.010449909" lastFinishedPulling="2025-10-02 18:36:35.2700147 +0000 UTC m=+1116.457510559" observedRunningTime="2025-10-02 18:36:38.340475243 +0000 UTC m=+1119.527971102" watchObservedRunningTime="2025-10-02 18:36:38.341255078 +0000 UTC m=+1119.528750937" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.350174 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" podStartSLOduration=4.653029539 podStartE2EDuration="28.350155145s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:11.571884662 +0000 UTC m=+1092.759380521" lastFinishedPulling="2025-10-02 18:36:35.269010268 +0000 UTC m=+1116.456506127" observedRunningTime="2025-10-02 18:36:38.32237552 +0000 UTC m=+1119.509871379" 
watchObservedRunningTime="2025-10-02 18:36:38.350155145 +0000 UTC m=+1119.537650994" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.376253 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" podStartSLOduration=28.376238007 podStartE2EDuration="28.376238007s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:36:38.373650576 +0000 UTC m=+1119.561146435" watchObservedRunningTime="2025-10-02 18:36:38.376238007 +0000 UTC m=+1119.563733866" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.396800 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" podStartSLOduration=7.693922462 podStartE2EDuration="28.396784626s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:11.312647984 +0000 UTC m=+1092.500143833" lastFinishedPulling="2025-10-02 18:36:32.015510138 +0000 UTC m=+1113.203005997" observedRunningTime="2025-10-02 18:36:38.393581197 +0000 UTC m=+1119.581077056" watchObservedRunningTime="2025-10-02 18:36:38.396784626 +0000 UTC m=+1119.584280485" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.420047 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" podStartSLOduration=5.986686295 podStartE2EDuration="28.420022459s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.833619731 +0000 UTC m=+1094.021115590" lastFinishedPulling="2025-10-02 18:36:35.266955885 +0000 UTC m=+1116.454451754" observedRunningTime="2025-10-02 18:36:38.415785207 +0000 UTC m=+1119.603281066" watchObservedRunningTime="2025-10-02 18:36:38.420022459 +0000 UTC 
m=+1119.607518318" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.441836 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" podStartSLOduration=6.917361437 podStartE2EDuration="28.441815737s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.071819902 +0000 UTC m=+1093.259315761" lastFinishedPulling="2025-10-02 18:36:33.596274202 +0000 UTC m=+1114.783770061" observedRunningTime="2025-10-02 18:36:38.430483075 +0000 UTC m=+1119.617978944" watchObservedRunningTime="2025-10-02 18:36:38.441815737 +0000 UTC m=+1119.629311596" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.549608 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" podStartSLOduration=7.673278382 podStartE2EDuration="28.549587011s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.063962708 +0000 UTC m=+1093.251458567" lastFinishedPulling="2025-10-02 18:36:32.940271337 +0000 UTC m=+1114.127767196" observedRunningTime="2025-10-02 18:36:38.546882257 +0000 UTC m=+1119.734378116" watchObservedRunningTime="2025-10-02 18:36:38.549587011 +0000 UTC m=+1119.737082870" Oct 02 18:36:38 crc kubenswrapper[4909]: I1002 18:36:38.579651 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" podStartSLOduration=6.089173967 podStartE2EDuration="28.579627886s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.329839916 +0000 UTC m=+1094.517335775" lastFinishedPulling="2025-10-02 18:36:35.820293835 +0000 UTC m=+1117.007789694" observedRunningTime="2025-10-02 18:36:38.565554548 +0000 UTC m=+1119.753050417" watchObservedRunningTime="2025-10-02 18:36:38.579627886 +0000 UTC 
m=+1119.767123745" Oct 02 18:36:39 crc kubenswrapper[4909]: E1002 18:36:39.281545 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" podUID="d89c2056-bfda-4177-86e6-bc00964d5f22" Oct 02 18:36:39 crc kubenswrapper[4909]: E1002 18:36:39.281863 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" podUID="5198c9b3-5c15-49de-9f3a-da04c80c4eb3" Oct 02 18:36:39 crc kubenswrapper[4909]: E1002 18:36:39.285564 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.39:5001/openstack-k8s-operators/telemetry-operator:39199b16fb4ef65d07a91339e59f624612b96660\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" podUID="8e79ecec-ac98-42dd-b513-071fbd0e235a" Oct 02 18:36:39 crc kubenswrapper[4909]: E1002 18:36:39.285803 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fdb7ea8542adb2eca73f11bd78e6aebceed2ba7a1e9fdd149c75e0049d09ce0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" podUID="22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff" Oct 02 18:36:39 crc kubenswrapper[4909]: I1002 18:36:39.288306 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7bffff79d9-6k5xz" Oct 02 18:36:40 crc kubenswrapper[4909]: I1002 18:36:40.289315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" event={"ID":"b96ea387-f91f-47ef-ba02-4f61e8a750a3","Type":"ContainerStarted","Data":"013093333c1dc058bd17102b16b4c5babf8fb5d7519bb01d9af8c0597f80731a"} Oct 02 18:36:40 crc kubenswrapper[4909]: I1002 18:36:40.291246 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:40 crc kubenswrapper[4909]: I1002 18:36:40.325262 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" podStartSLOduration=4.23604282 podStartE2EDuration="30.32524114s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.289734291 +0000 UTC m=+1094.477230150" lastFinishedPulling="2025-10-02 18:36:39.378932611 +0000 UTC m=+1120.566428470" observedRunningTime="2025-10-02 18:36:40.319771929 +0000 UTC m=+1121.507267798" watchObservedRunningTime="2025-10-02 18:36:40.32524114 +0000 UTC m=+1121.512736999" Oct 02 18:36:40 crc kubenswrapper[4909]: I1002 18:36:40.640894 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9jlbp" Oct 02 18:36:41 crc kubenswrapper[4909]: I1002 18:36:41.025586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-886ls" Oct 02 18:36:41 crc kubenswrapper[4909]: I1002 18:36:41.422753 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mfzz4" Oct 02 18:36:46 crc kubenswrapper[4909]: I1002 18:36:46.343178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" event={"ID":"2b364096-1561-45dd-9c7c-0e20f76360a6","Type":"ContainerStarted","Data":"e555b3494011ca9392a4f58e2607148e04b46da71eeeed449066a75cf56f3c96"} Oct 02 18:36:46 crc kubenswrapper[4909]: I1002 18:36:46.367873 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf" podStartSLOduration=2.354050098 podStartE2EDuration="35.367852168s" podCreationTimestamp="2025-10-02 18:36:11 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.818760529 +0000 UTC m=+1094.006256388" lastFinishedPulling="2025-10-02 18:36:45.832562609 +0000 UTC m=+1127.020058458" observedRunningTime="2025-10-02 18:36:46.362888754 +0000 UTC m=+1127.550384613" watchObservedRunningTime="2025-10-02 18:36:46.367852168 +0000 UTC m=+1127.555348027" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.379493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" event={"ID":"404b32bd-4a96-46d7-b8fa-bcd5fde074aa","Type":"ContainerStarted","Data":"e3e0783d07bb9652d1db228fab58e5c55a1867b144ce646dfad3164c6ef7e9ad"} Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.381081 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.399516 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" podStartSLOduration=3.120497071 podStartE2EDuration="40.399489344s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 
18:36:12.818401589 +0000 UTC m=+1094.005897448" lastFinishedPulling="2025-10-02 18:36:50.097393863 +0000 UTC m=+1131.284889721" observedRunningTime="2025-10-02 18:36:50.396664076 +0000 UTC m=+1131.584159945" watchObservedRunningTime="2025-10-02 18:36:50.399489344 +0000 UTC m=+1131.586985203" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.564111 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-fz856" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.587005 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-2lwdw" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.732687 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-87dms" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.819328 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-28j82" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.847069 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-hkdrb" Oct 02 18:36:50 crc kubenswrapper[4909]: I1002 18:36:50.921099 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-zxm4p" Oct 02 18:36:51 crc kubenswrapper[4909]: I1002 18:36:51.235206 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-d5vkl" Oct 02 18:36:51 crc kubenswrapper[4909]: I1002 18:36:51.283613 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-dmsmk" Oct 02 18:36:51 crc kubenswrapper[4909]: I1002 18:36:51.470933 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-8pxwr" Oct 02 18:36:51 crc kubenswrapper[4909]: I1002 18:36:51.618395 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678btf42" Oct 02 18:36:53 crc kubenswrapper[4909]: I1002 18:36:53.054493 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:36:53 crc kubenswrapper[4909]: I1002 18:36:53.054589 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:36:55 crc kubenswrapper[4909]: I1002 18:36:55.424766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" event={"ID":"c2bd7e2c-d50f-4808-85f5-4bae9b91d272","Type":"ContainerStarted","Data":"a39d76d53b9c7b9c11c3000ccfd5761155034b07c67651ebdf520180cec661e9"} Oct 02 18:36:56 crc kubenswrapper[4909]: I1002 18:36:56.435106 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:36:57 crc kubenswrapper[4909]: I1002 18:36:57.446991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" event={"ID":"8e79ecec-ac98-42dd-b513-071fbd0e235a","Type":"ContainerStarted","Data":"c077e347c82ada482e085a4153401673c2e5cf52c073579acb4d5cb79936e14a"} Oct 02 18:36:57 crc kubenswrapper[4909]: I1002 18:36:57.447302 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:36:57 crc kubenswrapper[4909]: I1002 18:36:57.471328 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" podStartSLOduration=3.614133082 podStartE2EDuration="47.471303702s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.818165421 +0000 UTC m=+1094.005661280" lastFinishedPulling="2025-10-02 18:36:56.675336041 +0000 UTC m=+1137.862831900" observedRunningTime="2025-10-02 18:36:57.469198396 +0000 UTC m=+1138.656694345" watchObservedRunningTime="2025-10-02 18:36:57.471303702 +0000 UTC m=+1138.658799601" Oct 02 18:36:57 crc kubenswrapper[4909]: I1002 18:36:57.479904 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" podStartSLOduration=9.070743704 podStartE2EDuration="47.479884788s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.812839736 +0000 UTC m=+1094.000335595" lastFinishedPulling="2025-10-02 18:36:51.22198078 +0000 UTC m=+1132.409476679" observedRunningTime="2025-10-02 18:36:56.461254308 +0000 UTC m=+1137.648750187" watchObservedRunningTime="2025-10-02 18:36:57.479884788 +0000 UTC m=+1138.667380687" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.457425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" 
event={"ID":"5198c9b3-5c15-49de-9f3a-da04c80c4eb3","Type":"ContainerStarted","Data":"a3fccdfe8a4684d7f3e41e213a50bcb93f8cd2fd8284226bd6ed489c0ff1fab1"} Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.457856 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.459900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" event={"ID":"22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff","Type":"ContainerStarted","Data":"cf9971cd26a08597495a30417ac35b03739986a52dde055a48c97ab481ef72d6"} Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.460096 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.461633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" event={"ID":"d89c2056-bfda-4177-86e6-bc00964d5f22","Type":"ContainerStarted","Data":"1e1d7f94c60085b93db432638fb85296c8ee75d013c925bffd605503c3b81ee9"} Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.462097 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.464061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" event={"ID":"6d4671d7-88da-4834-afb5-5deaf7e84cdb","Type":"ContainerStarted","Data":"3b38a591dff080ae24559bafcfc647362aee6eed27deabee6c7dfa2c3153e87f"} Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.464526 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.466651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" event={"ID":"d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1","Type":"ContainerStarted","Data":"ae5e29f11a2ab58cc858452c2098379c90bcf6bef0cd690a6016566a926aefca"} Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.467077 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.501319 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" podStartSLOduration=3.478494461 podStartE2EDuration="48.501302596s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.830761753 +0000 UTC m=+1094.018257612" lastFinishedPulling="2025-10-02 18:36:57.853569868 +0000 UTC m=+1139.041065747" observedRunningTime="2025-10-02 18:36:58.498382615 +0000 UTC m=+1139.685878474" watchObservedRunningTime="2025-10-02 18:36:58.501302596 +0000 UTC m=+1139.688798455" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.503375 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" podStartSLOduration=3.884045131 podStartE2EDuration="48.50336213s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.2356018 +0000 UTC m=+1094.423097669" lastFinishedPulling="2025-10-02 18:36:57.854918789 +0000 UTC m=+1139.042414668" observedRunningTime="2025-10-02 18:36:58.483743719 +0000 UTC m=+1139.671239578" watchObservedRunningTime="2025-10-02 18:36:58.50336213 +0000 UTC m=+1139.690857989" Oct 02 18:36:58 crc kubenswrapper[4909]: 
I1002 18:36:58.518017 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" podStartSLOduration=3.949014398 podStartE2EDuration="48.518001674s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.285308644 +0000 UTC m=+1094.472804503" lastFinishedPulling="2025-10-02 18:36:57.8542959 +0000 UTC m=+1139.041791779" observedRunningTime="2025-10-02 18:36:58.516392235 +0000 UTC m=+1139.703888114" watchObservedRunningTime="2025-10-02 18:36:58.518001674 +0000 UTC m=+1139.705497533" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.538562 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" podStartSLOduration=3.931641052 podStartE2EDuration="48.538546655s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:13.249279055 +0000 UTC m=+1094.436774914" lastFinishedPulling="2025-10-02 18:36:57.856184648 +0000 UTC m=+1139.043680517" observedRunningTime="2025-10-02 18:36:58.534119136 +0000 UTC m=+1139.721614995" watchObservedRunningTime="2025-10-02 18:36:58.538546655 +0000 UTC m=+1139.726042514" Oct 02 18:36:58 crc kubenswrapper[4909]: I1002 18:36:58.555205 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" podStartSLOduration=3.544931907 podStartE2EDuration="48.555191552s" podCreationTimestamp="2025-10-02 18:36:10 +0000 UTC" firstStartedPulling="2025-10-02 18:36:12.835979004 +0000 UTC m=+1094.023474863" lastFinishedPulling="2025-10-02 18:36:57.846238649 +0000 UTC m=+1139.033734508" observedRunningTime="2025-10-02 18:36:58.551654423 +0000 UTC m=+1139.739150282" watchObservedRunningTime="2025-10-02 18:36:58.555191552 +0000 UTC m=+1139.742687411" Oct 02 18:37:01 crc kubenswrapper[4909]: I1002 18:37:01.042655 
4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-57hjm" Oct 02 18:37:01 crc kubenswrapper[4909]: I1002 18:37:01.251487 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-p9sbt" Oct 02 18:37:01 crc kubenswrapper[4909]: I1002 18:37:01.329693 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-769bf6645d-mpcx5" Oct 02 18:37:10 crc kubenswrapper[4909]: I1002 18:37:10.876182 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-g2lmn" Oct 02 18:37:10 crc kubenswrapper[4909]: I1002 18:37:10.991904 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-hwgrp" Oct 02 18:37:11 crc kubenswrapper[4909]: I1002 18:37:11.012112 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-9mjtf" Oct 02 18:37:11 crc kubenswrapper[4909]: I1002 18:37:11.062272 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-94lx8" Oct 02 18:37:11 crc kubenswrapper[4909]: I1002 18:37:11.583142 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-bh4kt" Oct 02 18:37:23 crc kubenswrapper[4909]: I1002 18:37:23.054971 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 02 18:37:23 crc kubenswrapper[4909]: I1002 18:37:23.057181 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.191369 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.193681 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.200366 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.203548 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.205822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.206045 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-g65zs" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.209958 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.334603 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.336394 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.341498 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.348773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.348885 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.354558 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.450084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.450161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 
crc kubenswrapper[4909]: I1002 18:37:35.450202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.450355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258n8\" (UniqueName: \"kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.450563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.451590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.471251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b\") pod \"dnsmasq-dns-675f4bcbfc-vd4m4\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 
18:37:35.511696 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.552407 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.552795 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258n8\" (UniqueName: \"kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.552902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.553931 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.555636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.573613 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258n8\" (UniqueName: \"kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8\") pod \"dnsmasq-dns-78dd6ddcc-w9kmv\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.660584 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:37:35 crc kubenswrapper[4909]: I1002 18:37:35.861748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:35 crc kubenswrapper[4909]: W1002 18:37:35.879481 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c592261_2177_495a_93d7_f9086de5ec3b.slice/crio-daa259990569ed481903e9b34efa111b7dde67cf192ab4795aa20f4e6dcfe096 WatchSource:0}: Error finding container daa259990569ed481903e9b34efa111b7dde67cf192ab4795aa20f4e6dcfe096: Status 404 returned error can't find the container with id daa259990569ed481903e9b34efa111b7dde67cf192ab4795aa20f4e6dcfe096 Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.017261 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.054501 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.058383 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.067555 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.162243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfgr9\" (UniqueName: \"kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.162298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.162321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.264243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.264325 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.264452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfgr9\" (UniqueName: \"kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.265258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.265487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.286896 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfgr9\" (UniqueName: \"kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9\") pod \"dnsmasq-dns-5ccc8479f9-gqv78\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.341417 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.341991 4909 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.348618 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.376669 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.377867 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.407534 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.470730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.470799 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.470857 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppr6\" (UniqueName: \"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc 
kubenswrapper[4909]: I1002 18:37:36.572145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.572539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.572595 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppr6\" (UniqueName: \"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.573058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.573668 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.593745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2ppr6\" (UniqueName: \"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6\") pod \"dnsmasq-dns-57d769cc4f-jz7g2\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.718436 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.884645 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.887106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" event={"ID":"4db3b229-071d-42d7-81bf-44f3221e4cd1","Type":"ContainerStarted","Data":"82091f3f9b14741f3989017fbe9e2d00dd5dd68081c1c48b9ebc23cf72a457bd"} Oct 02 18:37:36 crc kubenswrapper[4909]: I1002 18:37:36.889901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" event={"ID":"0c592261-2177-495a-93d7-f9086de5ec3b","Type":"ContainerStarted","Data":"daa259990569ed481903e9b34efa111b7dde67cf192ab4795aa20f4e6dcfe096"} Oct 02 18:37:36 crc kubenswrapper[4909]: W1002 18:37:36.910218 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed94911_9861_4b29_9f39_cfd3be4bc01a.slice/crio-93092a93669cb31fe82af1d9001235264d42c1a12f9b368359d758112c354613 WatchSource:0}: Error finding container 93092a93669cb31fe82af1d9001235264d42c1a12f9b368359d758112c354613: Status 404 returned error can't find the container with id 93092a93669cb31fe82af1d9001235264d42c1a12f9b368359d758112c354613 Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.203614 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 
18:37:37.212512 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.213828 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.221289 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.221316 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.221521 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xdbxc" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.221593 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.221882 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.222068 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.222245 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.231198 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284649 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284675 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284870 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.284892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.285005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqlx\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.285058 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.285084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386708 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386816 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386882 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386917 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqlx\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.386961 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.387014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.387524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.387596 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.387775 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.389453 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.389457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.389795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.393691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.396888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.400476 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.401570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.405733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqlx\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.409294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.512013 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.513881 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.523964 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.524215 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.524907 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.525184 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.525313 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-f6kjx" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.525416 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.525518 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.525667 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.550722 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589435 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589488 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589587 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" 
Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6rwb\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.589969 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.691641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.691980 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6rwb\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692011 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" 
Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692352 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.692376 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.695381 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.699692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.699720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.702783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.703001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.703554 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.706822 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.721495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.740272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 
18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.740505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6rwb\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.758614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.870433 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.920099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" event={"ID":"ef353dfc-8f35-483c-ad2a-2df34420381f","Type":"ContainerStarted","Data":"37081b51f2c4fdbde2e2eb7dc6e53a96f742c18d55e05c8034871041e6b1239e"} Oct 02 18:37:37 crc kubenswrapper[4909]: I1002 18:37:37.922173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" event={"ID":"fed94911-9861-4b29-9f39-cfd3be4bc01a","Type":"ContainerStarted","Data":"93092a93669cb31fe82af1d9001235264d42c1a12f9b368359d758112c354613"} Oct 02 18:37:38 crc kubenswrapper[4909]: I1002 18:37:38.036160 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:37:38 crc kubenswrapper[4909]: W1002 18:37:38.125752 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd73c266b_a3db_431b_a40f_f0a5b9d06610.slice/crio-b5af4f85b08d4df859e6830fb07a7025e820902e0936214d7f745d7593c570ae WatchSource:0}: Error 
finding container b5af4f85b08d4df859e6830fb07a7025e820902e0936214d7f745d7593c570ae: Status 404 returned error can't find the container with id b5af4f85b08d4df859e6830fb07a7025e820902e0936214d7f745d7593c570ae Oct 02 18:37:38 crc kubenswrapper[4909]: I1002 18:37:38.531192 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:37:38 crc kubenswrapper[4909]: W1002 18:37:38.563285 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0396bfb_ab96_4eb9_af72_e3597ca74ca4.slice/crio-33584b1bdd64ae17404374ab0cf3a532de1f065a14d35bedd1e3afa6e69f48cf WatchSource:0}: Error finding container 33584b1bdd64ae17404374ab0cf3a532de1f065a14d35bedd1e3afa6e69f48cf: Status 404 returned error can't find the container with id 33584b1bdd64ae17404374ab0cf3a532de1f065a14d35bedd1e3afa6e69f48cf Oct 02 18:37:38 crc kubenswrapper[4909]: I1002 18:37:38.935661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerStarted","Data":"33584b1bdd64ae17404374ab0cf3a532de1f065a14d35bedd1e3afa6e69f48cf"} Oct 02 18:37:38 crc kubenswrapper[4909]: I1002 18:37:38.940343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerStarted","Data":"b5af4f85b08d4df859e6830fb07a7025e820902e0936214d7f745d7593c570ae"} Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.117908 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.120189 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.124563 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5qgt4" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.124631 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.124793 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.124634 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.125365 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.135887 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.147320 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.231988 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-default\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232044 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7rr\" (UniqueName: \"kubernetes.io/projected/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kube-api-access-qc7rr\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " 
pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232070 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-secrets\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kolla-config\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc 
kubenswrapper[4909]: I1002 18:37:40.232492 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.232517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.243833 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.253721 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.255713 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.256213 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nwhq4" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.256230 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.256724 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.258864 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kolla-config\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " 
pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-default\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7rr\" (UniqueName: \"kubernetes.io/projected/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kube-api-access-qc7rr\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334652 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334672 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.334712 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-secrets\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.335855 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.336238 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.337004 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.337251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kolla-config\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.337861 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/f308ff37-5d0f-4b6f-8b67-1ab86795e820-config-data-default\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.340939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-secrets\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.341140 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.352650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7rr\" (UniqueName: \"kubernetes.io/projected/f308ff37-5d0f-4b6f-8b67-1ab86795e820-kube-api-access-qc7rr\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.353644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f308ff37-5d0f-4b6f-8b67-1ab86795e820-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.360384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f308ff37-5d0f-4b6f-8b67-1ab86795e820\") " pod="openstack/openstack-galera-0" Oct 02 18:37:40 
crc kubenswrapper[4909]: I1002 18:37:40.436169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436306 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mq9\" (UniqueName: \"kubernetes.io/projected/8fa4c480-f836-44f0-b313-ac6cf9e97262-kube-api-access-s4mq9\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 
crc kubenswrapper[4909]: I1002 18:37:40.436328 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.436455 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.452690 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.537954 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.537996 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538019 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mq9\" (UniqueName: \"kubernetes.io/projected/8fa4c480-f836-44f0-b313-ac6cf9e97262-kube-api-access-s4mq9\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc 
kubenswrapper[4909]: I1002 18:37:40.538131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538161 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.538231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.539400 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.541212 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.542783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.544456 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.544743 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.544829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.545131 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8fa4c480-f836-44f0-b313-ac6cf9e97262-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.545172 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fa4c480-f836-44f0-b313-ac6cf9e97262-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.565146 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mq9\" (UniqueName: \"kubernetes.io/projected/8fa4c480-f836-44f0-b313-ac6cf9e97262-kube-api-access-s4mq9\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.610235 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.612902 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.617228 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ndvrx" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.618092 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.618315 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.636437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8fa4c480-f836-44f0-b313-ac6cf9e97262\") " pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.654570 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.740988 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.741038 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.741147 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-config-data\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.741176 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7g72\" (UniqueName: \"kubernetes.io/projected/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kube-api-access-g7g72\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.741221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kolla-config\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.842951 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.842995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.843094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-config-data\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " 
pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.843118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7g72\" (UniqueName: \"kubernetes.io/projected/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kube-api-access-g7g72\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.843841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-config-data\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.843904 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kolla-config\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.844402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kolla-config\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.846905 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.847095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f77fb311-1680-4c6f-ac0c-70baa3e89b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.858678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7g72\" (UniqueName: \"kubernetes.io/projected/f77fb311-1680-4c6f-ac0c-70baa3e89b81-kube-api-access-g7g72\") pod \"memcached-0\" (UID: \"f77fb311-1680-4c6f-ac0c-70baa3e89b81\") " pod="openstack/memcached-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.883368 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 18:37:40 crc kubenswrapper[4909]: I1002 18:37:40.968910 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 18:37:42 crc kubenswrapper[4909]: I1002 18:37:42.893614 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:37:42 crc kubenswrapper[4909]: I1002 18:37:42.894989 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:37:42 crc kubenswrapper[4909]: I1002 18:37:42.898852 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2mvpc" Oct 02 18:37:42 crc kubenswrapper[4909]: I1002 18:37:42.920690 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:37:42 crc kubenswrapper[4909]: I1002 18:37:42.992961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49df\" (UniqueName: \"kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df\") pod \"kube-state-metrics-0\" (UID: \"fe1b0647-0d74-4e52-9793-46dc1fa7d405\") " pod="openstack/kube-state-metrics-0" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.101159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49df\" (UniqueName: \"kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df\") pod \"kube-state-metrics-0\" (UID: \"fe1b0647-0d74-4e52-9793-46dc1fa7d405\") " pod="openstack/kube-state-metrics-0" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.136747 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49df\" (UniqueName: \"kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df\") pod \"kube-state-metrics-0\" (UID: \"fe1b0647-0d74-4e52-9793-46dc1fa7d405\") " pod="openstack/kube-state-metrics-0" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.239560 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.581221 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s"] Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.582260 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.590387 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-l9xs9" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.591479 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.600014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s"] Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.717427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpjl4\" (UniqueName: \"kubernetes.io/projected/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-kube-api-access-vpjl4\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.717622 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 
18:37:43.821361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.821470 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpjl4\" (UniqueName: \"kubernetes.io/projected/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-kube-api-access-vpjl4\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.834978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.863720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpjl4\" (UniqueName: \"kubernetes.io/projected/f89fa753-5d7e-4c80-8847-25eeaee0c3e3-kube-api-access-vpjl4\") pod \"observability-ui-dashboards-6584dc9448-rtc7s\" (UID: \"f89fa753-5d7e-4c80-8847-25eeaee0c3e3\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.916583 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.958638 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-678cfbdd98-xhl7w"] Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.959812 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:43 crc kubenswrapper[4909]: I1002 18:37:43.977641 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678cfbdd98-xhl7w"] Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-oauth-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2ds\" (UniqueName: \"kubernetes.io/projected/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-kube-api-access-mx2ds\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135604 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-service-ca\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-trusted-ca-bundle\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.135680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-oauth-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.168182 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.170410 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.173744 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.174014 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.175006 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.175451 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.176523 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-27q5x" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.180002 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.181319 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.237309 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-service-ca\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.237361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-trusted-ca-bundle\") pod 
\"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.237390 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.237418 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-oauth-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.238338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-service-ca\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.238497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-oauth-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.238928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-trusted-ca-bundle\") pod \"console-678cfbdd98-xhl7w\" (UID: 
\"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.240199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-oauth-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.240326 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.240453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2ds\" (UniqueName: \"kubernetes.io/projected/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-kube-api-access-mx2ds\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.240895 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.241866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-serving-cert\") pod \"console-678cfbdd98-xhl7w\" (UID: 
\"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.245864 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-console-oauth-config\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.263426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2ds\" (UniqueName: \"kubernetes.io/projected/6d6581a2-6532-49e3-85a2-cc7cc0a5d77f-kube-api-access-mx2ds\") pod \"console-678cfbdd98-xhl7w\" (UID: \"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f\") " pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.284212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342134 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342215 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342506 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvgm\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.342639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc 
kubenswrapper[4909]: I1002 18:37:44.342723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.444694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.444951 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445049 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " 
pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkvgm\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.445183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.446745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " 
pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.448753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.448953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.453829 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.453859 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf86cfcf601ef491894e84c5443d83ba8dd6ded986121e5a4bb1909afcd734cb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.481114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 
18:37:44.481122 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.486424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkvgm\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.486764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.488601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:44 crc kubenswrapper[4909]: I1002 18:37:44.495839 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.210350 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sj5pf"] Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.212127 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.215082 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.215405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ff9p4" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.216880 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.220574 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wz8jk"] Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.223603 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.235801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sj5pf"] Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.243132 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wz8jk"] Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.380601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-combined-ca-bundle\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.380684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-run\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.380740 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjmq\" (UniqueName: \"kubernetes.io/projected/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-kube-api-access-mhjmq\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.380851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-log\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: 
I1002 18:37:46.380962 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-scripts\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.380994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-lib\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-scripts\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-log-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381323 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381371 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpm8j\" (UniqueName: \"kubernetes.io/projected/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-kube-api-access-wpm8j\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-ovn-controller-tls-certs\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.381565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-etc-ovs\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.483570 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-lib\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.483622 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-scripts\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.484271 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-lib\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.484365 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-log-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.485792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-scripts\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.483682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-log-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.485929 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " 
pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.485953 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpm8j\" (UniqueName: \"kubernetes.io/projected/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-kube-api-access-wpm8j\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486106 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run-ovn\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-ovn-controller-tls-certs\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-etc-ovs\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486355 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-var-run\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486634 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-etc-ovs\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486806 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-combined-ca-bundle\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-run\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486929 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjmq\" (UniqueName: \"kubernetes.io/projected/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-kube-api-access-mhjmq\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-log\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.486994 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-run\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.487012 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-scripts\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.487098 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-var-log\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.490769 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-combined-ca-bundle\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.497215 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-ovn-controller-tls-certs\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " 
pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.498397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-scripts\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.506569 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpm8j\" (UniqueName: \"kubernetes.io/projected/01e45a1f-f70c-4e2f-94ed-763af4c5b5cb-kube-api-access-wpm8j\") pod \"ovn-controller-ovs-wz8jk\" (UID: \"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb\") " pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.506749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjmq\" (UniqueName: \"kubernetes.io/projected/9f1ef01b-ae9b-4a58-a411-d2e2e5770742-kube-api-access-mhjmq\") pod \"ovn-controller-sj5pf\" (UID: \"9f1ef01b-ae9b-4a58-a411-d2e2e5770742\") " pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.542579 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sj5pf" Oct 02 18:37:46 crc kubenswrapper[4909]: I1002 18:37:46.553510 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.885478 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.887670 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.894439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.894473 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.894478 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-87cz4" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.894791 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.906882 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 18:37:47 crc kubenswrapper[4909]: I1002 18:37:47.908536 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9f9r\" (UniqueName: \"kubernetes.io/projected/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-kube-api-access-v9f9r\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020283 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020303 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020345 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020433 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020458 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.020474 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.121702 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.121771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.121801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122094 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122285 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " 
pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.122713 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9f9r\" (UniqueName: \"kubernetes.io/projected/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-kube-api-access-v9f9r\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.123487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.124231 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.128992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.141923 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.143100 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.144043 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9f9r\" (UniqueName: \"kubernetes.io/projected/2e32a55a-2e45-4365-a8cf-3002a0c0ba73-kube-api-access-v9f9r\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " 
pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.144312 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e32a55a-2e45-4365-a8cf-3002a0c0ba73\") " pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:48 crc kubenswrapper[4909]: I1002 18:37:48.215294 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.120123 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.122632 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.127152 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ztw8d" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.127318 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.127348 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.171443 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.171985 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279119 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-config\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72fj\" (UniqueName: \"kubernetes.io/projected/17ca7d1c-52c8-480e-975a-d22877f0971f-kube-api-access-s72fj\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279395 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279462 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 
02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.279557 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381047 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-config\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381186 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72fj\" (UniqueName: \"kubernetes.io/projected/17ca7d1c-52c8-480e-975a-d22877f0971f-kube-api-access-s72fj\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381244 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381295 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.381581 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"17ca7d1c-52c8-480e-975a-d22877f0971f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.382454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.383075 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca7d1c-52c8-480e-975a-d22877f0971f-config\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.383252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.387524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.390762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.394411 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17ca7d1c-52c8-480e-975a-d22877f0971f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.398621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72fj\" (UniqueName: \"kubernetes.io/projected/17ca7d1c-52c8-480e-975a-d22877f0971f-kube-api-access-s72fj\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.414169 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17ca7d1c-52c8-480e-975a-d22877f0971f\") " pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:50 crc kubenswrapper[4909]: I1002 18:37:50.505813 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 18:37:53 crc kubenswrapper[4909]: I1002 18:37:53.054398 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:37:53 crc kubenswrapper[4909]: I1002 18:37:53.054649 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:37:53 crc kubenswrapper[4909]: I1002 18:37:53.054690 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:37:53 crc kubenswrapper[4909]: I1002 18:37:53.055521 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:37:53 crc kubenswrapper[4909]: I1002 18:37:53.055570 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5" gracePeriod=600 Oct 02 18:37:54 crc kubenswrapper[4909]: I1002 18:37:54.246897 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5" exitCode=0 Oct 02 18:37:54 crc kubenswrapper[4909]: I1002 18:37:54.246989 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5"} Oct 02 18:37:54 crc kubenswrapper[4909]: I1002 18:37:54.247306 4909 scope.go:117] "RemoveContainer" containerID="573b7146b529d3438c92c471059130780d1a5678b93eb10b123e4e2ecdc9d9a5" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.944544 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.944691 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnt8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vd4m4_openstack(0c592261-2177-495a-93d7-f9086de5ec3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.945905 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" podUID="0c592261-2177-495a-93d7-f9086de5ec3b" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.946445 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.946590 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-258n8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-w9kmv_openstack(4db3b229-071d-42d7-81bf-44f3221e4cd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:37:54 crc kubenswrapper[4909]: E1002 18:37:54.947734 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" podUID="4db3b229-071d-42d7-81bf-44f3221e4cd1" Oct 02 18:37:55 crc kubenswrapper[4909]: E1002 18:37:55.259976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" podUID="4db3b229-071d-42d7-81bf-44f3221e4cd1" Oct 02 18:37:56 crc kubenswrapper[4909]: E1002 18:37:56.180350 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 02 18:37:56 crc kubenswrapper[4909]: E1002 18:37:56.180977 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6rwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(a0396bfb-ab96-4eb9-af72-e3597ca74ca4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:37:56 crc kubenswrapper[4909]: E1002 18:37:56.182238 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" Oct 02 18:37:56 crc kubenswrapper[4909]: E1002 18:37:56.283201 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.410495 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.495797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config\") pod \"0c592261-2177-495a-93d7-f9086de5ec3b\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.496255 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b\") pod \"0c592261-2177-495a-93d7-f9086de5ec3b\" (UID: \"0c592261-2177-495a-93d7-f9086de5ec3b\") " Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.497174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config" (OuterVolumeSpecName: "config") pod "0c592261-2177-495a-93d7-f9086de5ec3b" (UID: "0c592261-2177-495a-93d7-f9086de5ec3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.507708 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b" (OuterVolumeSpecName: "kube-api-access-gnt8b") pod "0c592261-2177-495a-93d7-f9086de5ec3b" (UID: "0c592261-2177-495a-93d7-f9086de5ec3b"). InnerVolumeSpecName "kube-api-access-gnt8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.598127 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c592261-2177-495a-93d7-f9086de5ec3b-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:37:56 crc kubenswrapper[4909]: I1002 18:37:56.598157 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/0c592261-2177-495a-93d7-f9086de5ec3b-kube-api-access-gnt8b\") on node \"crc\" DevicePath \"\"" Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.016603 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.023420 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 18:37:57 crc kubenswrapper[4909]: W1002 18:37:57.075346 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf77fb311_1680_4c6f_ac0c_70baa3e89b81.slice/crio-0f63314722aa6e39e5b49ac0c2311ed68a6ba228e76be644b3d249838137b450 WatchSource:0}: Error finding container 0f63314722aa6e39e5b49ac0c2311ed68a6ba228e76be644b3d249838137b450: Status 404 returned error can't find the container with id 0f63314722aa6e39e5b49ac0c2311ed68a6ba228e76be644b3d249838137b450 Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.076365 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 18:37:57 crc kubenswrapper[4909]: W1002 18:37:57.283240 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7485bfaa_555b_4469_9681_bc735a109726.slice/crio-16734674d76ebd848942e08eb4571dab2bdce400f0ea3183c973a0bfcb062009 WatchSource:0}: Error finding container 
16734674d76ebd848942e08eb4571dab2bdce400f0ea3183c973a0bfcb062009: Status 404 returned error can't find the container with id 16734674d76ebd848942e08eb4571dab2bdce400f0ea3183c973a0bfcb062009 Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.285353 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.302529 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.311451 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f308ff37-5d0f-4b6f-8b67-1ab86795e820","Type":"ContainerStarted","Data":"fb72b6757ac045b2359a70d5c5d50d611eb33f50d6334aceec7b06c68081f3ef"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.314574 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerID="d959cdb36cc3e4d4544fd4521ab4234f2a915f69ebcba54c4499da5c490b6432" exitCode=0 Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.314648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" event={"ID":"ef353dfc-8f35-483c-ad2a-2df34420381f","Type":"ContainerDied","Data":"d959cdb36cc3e4d4544fd4521ab4234f2a915f69ebcba54c4499da5c490b6432"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.326564 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.365878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"f77fb311-1680-4c6f-ac0c-70baa3e89b81","Type":"ContainerStarted","Data":"0f63314722aa6e39e5b49ac0c2311ed68a6ba228e76be644b3d249838137b450"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.403708 4909 generic.go:334] "Generic (PLEG): container finished" podID="fed94911-9861-4b29-9f39-cfd3be4bc01a" containerID="678d458bf372e07e17056c8f3bf4c6850131c627bb843bb423afd13fd0e2c930" exitCode=0 Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.403791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" event={"ID":"fed94911-9861-4b29-9f39-cfd3be4bc01a","Type":"ContainerDied","Data":"678d458bf372e07e17056c8f3bf4c6850131c627bb843bb423afd13fd0e2c930"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.454926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe1b0647-0d74-4e52-9793-46dc1fa7d405","Type":"ContainerStarted","Data":"3a85e42a5a1defaa56ef731a7a5b649cbea528897e4b6983804768b28075d310"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.495249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" event={"ID":"0c592261-2177-495a-93d7-f9086de5ec3b","Type":"ContainerDied","Data":"daa259990569ed481903e9b34efa111b7dde67cf192ab4795aa20f4e6dcfe096"} Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.495374 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vd4m4" Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.542112 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sj5pf"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.688846 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678cfbdd98-xhl7w"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.689075 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.689086 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vd4m4"] Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.689099 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s"] Oct 02 18:37:57 crc kubenswrapper[4909]: W1002 18:37:57.808210 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89fa753_5d7e_4c80_8847_25eeaee0c3e3.slice/crio-3fd108ff7b570b1994fa66aa2b804a46b59caf4eb8309f4880edba5ad431c2c0 WatchSource:0}: Error finding container 3fd108ff7b570b1994fa66aa2b804a46b59caf4eb8309f4880edba5ad431c2c0: Status 404 returned error can't find the container with id 3fd108ff7b570b1994fa66aa2b804a46b59caf4eb8309f4880edba5ad431c2c0 Oct 02 18:37:57 crc kubenswrapper[4909]: I1002 18:37:57.983193 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.042683 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.130736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfgr9\" (UniqueName: \"kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9\") pod \"fed94911-9861-4b29-9f39-cfd3be4bc01a\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.130838 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config\") pod \"fed94911-9861-4b29-9f39-cfd3be4bc01a\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.130899 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc\") pod \"fed94911-9861-4b29-9f39-cfd3be4bc01a\" (UID: \"fed94911-9861-4b29-9f39-cfd3be4bc01a\") " Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.136595 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9" (OuterVolumeSpecName: "kube-api-access-lfgr9") pod "fed94911-9861-4b29-9f39-cfd3be4bc01a" (UID: "fed94911-9861-4b29-9f39-cfd3be4bc01a"). InnerVolumeSpecName "kube-api-access-lfgr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.152214 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fed94911-9861-4b29-9f39-cfd3be4bc01a" (UID: "fed94911-9861-4b29-9f39-cfd3be4bc01a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.154095 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config" (OuterVolumeSpecName: "config") pod "fed94911-9861-4b29-9f39-cfd3be4bc01a" (UID: "fed94911-9861-4b29-9f39-cfd3be4bc01a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.233676 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.233859 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfgr9\" (UniqueName: \"kubernetes.io/projected/fed94911-9861-4b29-9f39-cfd3be4bc01a-kube-api-access-lfgr9\") on node \"crc\" DevicePath \"\"" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.233876 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed94911-9861-4b29-9f39-cfd3be4bc01a-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.509933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fa4c480-f836-44f0-b313-ac6cf9e97262","Type":"ContainerStarted","Data":"ef48a6136916dc21dc0db76659219e3e8c5298b37421a50a553a26894d2c38d2"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.513084 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e32a55a-2e45-4365-a8cf-3002a0c0ba73","Type":"ContainerStarted","Data":"b5d3002f7756b929d58bfb095f8d166afa8fb6d9f74f0671b472822a6d6d0046"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.514503 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" event={"ID":"f89fa753-5d7e-4c80-8847-25eeaee0c3e3","Type":"ContainerStarted","Data":"3fd108ff7b570b1994fa66aa2b804a46b59caf4eb8309f4880edba5ad431c2c0"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.516492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerStarted","Data":"f418b687441436b3a07b048f6faffb535085d2fc78e0e54a8616c84206f2f045"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.517898 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" event={"ID":"fed94911-9861-4b29-9f39-cfd3be4bc01a","Type":"ContainerDied","Data":"93092a93669cb31fe82af1d9001235264d42c1a12f9b368359d758112c354613"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.517944 4909 scope.go:117] "RemoveContainer" containerID="678d458bf372e07e17056c8f3bf4c6850131c627bb843bb423afd13fd0e2c930" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.518011 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gqv78" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.521794 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678cfbdd98-xhl7w" event={"ID":"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f","Type":"ContainerStarted","Data":"e5af0cb86f66ccbd6198210dfd73e2b77a6403b82d52403ce8d110330c6cd208"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.521839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678cfbdd98-xhl7w" event={"ID":"6d6581a2-6532-49e3-85a2-cc7cc0a5d77f","Type":"ContainerStarted","Data":"26ee1219891ce4db5d9e3877f6d790632db50efbd9957dbcdebb2978ff7a5dad"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.526197 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" event={"ID":"ef353dfc-8f35-483c-ad2a-2df34420381f","Type":"ContainerStarted","Data":"716e80106b4a3c6ae0c559b68492ff4badbab0591931bc77196fc713177cf1e0"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.527142 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.529124 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerStarted","Data":"16734674d76ebd848942e08eb4571dab2bdce400f0ea3183c973a0bfcb062009"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.532383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf" event={"ID":"9f1ef01b-ae9b-4a58-a411-d2e2e5770742","Type":"ContainerStarted","Data":"f8fff94fd3f9b85c301914ec22c49e0556179a2a4e54745a976b03aa73162481"} Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.586488 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" podStartSLOduration=3.427951515 podStartE2EDuration="22.586468444s" podCreationTimestamp="2025-10-02 18:37:36 +0000 UTC" firstStartedPulling="2025-10-02 18:37:37.202594701 +0000 UTC m=+1178.390090560" lastFinishedPulling="2025-10-02 18:37:56.36111163 +0000 UTC m=+1197.548607489" observedRunningTime="2025-10-02 18:37:58.57703433 +0000 UTC m=+1199.764530209" watchObservedRunningTime="2025-10-02 18:37:58.586468444 +0000 UTC m=+1199.773964303" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.601921 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-678cfbdd98-xhl7w" podStartSLOduration=15.601900405 podStartE2EDuration="15.601900405s" podCreationTimestamp="2025-10-02 18:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:37:58.593520893 +0000 UTC m=+1199.781016782" watchObservedRunningTime="2025-10-02 18:37:58.601900405 +0000 UTC m=+1199.789396264" Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.617283 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.651681 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.661427 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gqv78"] Oct 02 18:37:58 crc kubenswrapper[4909]: I1002 18:37:58.989910 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wz8jk"] Oct 02 18:37:59 crc kubenswrapper[4909]: W1002 18:37:59.241291 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e45a1f_f70c_4e2f_94ed_763af4c5b5cb.slice/crio-8a68eebfba02ee7eae4476229a3a5d6f508efb8f0a5f3d7cb25dfc25f10a9b0f WatchSource:0}: Error finding container 8a68eebfba02ee7eae4476229a3a5d6f508efb8f0a5f3d7cb25dfc25f10a9b0f: Status 404 returned error can't find the container with id 8a68eebfba02ee7eae4476229a3a5d6f508efb8f0a5f3d7cb25dfc25f10a9b0f Oct 02 18:37:59 crc kubenswrapper[4909]: I1002 18:37:59.544604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"17ca7d1c-52c8-480e-975a-d22877f0971f","Type":"ContainerStarted","Data":"27c4635e1d7a8dbf71b5fd69b55bc8db400a457de4115b8cf41187aa436f2248"} Oct 02 18:37:59 crc kubenswrapper[4909]: I1002 18:37:59.551231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wz8jk" event={"ID":"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb","Type":"ContainerStarted","Data":"8a68eebfba02ee7eae4476229a3a5d6f508efb8f0a5f3d7cb25dfc25f10a9b0f"} Oct 02 18:37:59 crc kubenswrapper[4909]: I1002 18:37:59.634107 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c592261-2177-495a-93d7-f9086de5ec3b" path="/var/lib/kubelet/pods/0c592261-2177-495a-93d7-f9086de5ec3b/volumes" Oct 02 18:37:59 crc kubenswrapper[4909]: I1002 18:37:59.634524 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed94911-9861-4b29-9f39-cfd3be4bc01a" path="/var/lib/kubelet/pods/fed94911-9861-4b29-9f39-cfd3be4bc01a/volumes" Oct 02 18:38:04 crc kubenswrapper[4909]: I1002 18:38:04.285080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:38:04 crc kubenswrapper[4909]: I1002 18:38:04.285695 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:38:04 crc kubenswrapper[4909]: I1002 18:38:04.294723 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:38:04 crc kubenswrapper[4909]: I1002 18:38:04.593972 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-678cfbdd98-xhl7w" Oct 02 18:38:04 crc kubenswrapper[4909]: I1002 18:38:04.651070 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"] Oct 02 18:38:06 crc kubenswrapper[4909]: I1002 18:38:06.720189 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:38:06 crc kubenswrapper[4909]: I1002 18:38:06.782200 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.500075 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.612072 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-258n8\" (UniqueName: \"kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8\") pod \"4db3b229-071d-42d7-81bf-44f3221e4cd1\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.612227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc\") pod \"4db3b229-071d-42d7-81bf-44f3221e4cd1\" (UID: \"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.612370 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config\") pod \"4db3b229-071d-42d7-81bf-44f3221e4cd1\" (UID: 
\"4db3b229-071d-42d7-81bf-44f3221e4cd1\") " Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.613749 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4db3b229-071d-42d7-81bf-44f3221e4cd1" (UID: "4db3b229-071d-42d7-81bf-44f3221e4cd1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.614526 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config" (OuterVolumeSpecName: "config") pod "4db3b229-071d-42d7-81bf-44f3221e4cd1" (UID: "4db3b229-071d-42d7-81bf-44f3221e4cd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.622599 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.624507 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8" (OuterVolumeSpecName: "kube-api-access-258n8") pod "4db3b229-071d-42d7-81bf-44f3221e4cd1" (UID: "4db3b229-071d-42d7-81bf-44f3221e4cd1"). InnerVolumeSpecName "kube-api-access-258n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.714594 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.714823 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db3b229-071d-42d7-81bf-44f3221e4cd1-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.714841 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-258n8\" (UniqueName: \"kubernetes.io/projected/4db3b229-071d-42d7-81bf-44f3221e4cd1-kube-api-access-258n8\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.728999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w9kmv" event={"ID":"4db3b229-071d-42d7-81bf-44f3221e4cd1","Type":"ContainerDied","Data":"82091f3f9b14741f3989017fbe9e2d00dd5dd68081c1c48b9ebc23cf72a457bd"} Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.978490 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:38:07 crc kubenswrapper[4909]: I1002 18:38:07.986656 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9kmv"] Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.629180 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db3b229-071d-42d7-81bf-44f3221e4cd1" path="/var/lib/kubelet/pods/4db3b229-071d-42d7-81bf-44f3221e4cd1/volumes" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.642428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"fe1b0647-0d74-4e52-9793-46dc1fa7d405","Type":"ContainerStarted","Data":"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.642940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.645928 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e32a55a-2e45-4365-a8cf-3002a0c0ba73","Type":"ContainerStarted","Data":"3f5c93e1174b0c8994483244a0aead3316dd33f66415e3c85bebab6b2f94b62c"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.648119 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"17ca7d1c-52c8-480e-975a-d22877f0971f","Type":"ContainerStarted","Data":"a13530d07a2b27e7b7c8d801aa50b046d2757d88034014bbeef451e4eae312fb"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.652115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf" event={"ID":"9f1ef01b-ae9b-4a58-a411-d2e2e5770742","Type":"ContainerStarted","Data":"e7b571a405b28bf085f0a6d8d06a6fc0ceaa0c27be01bf61922a282b829432bf"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.652178 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sj5pf" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.654847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f77fb311-1680-4c6f-ac0c-70baa3e89b81","Type":"ContainerStarted","Data":"2233bd89189e18ebd33089a2014537f1c5197339cd24bf1f1c26d78707807a48"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.654932 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.656880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"8fa4c480-f836-44f0-b313-ac6cf9e97262","Type":"ContainerStarted","Data":"04c465dc006be440e16a4f7452c47cec78deb1c33e2b56d7c2a9529aed48d032"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.658879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f308ff37-5d0f-4b6f-8b67-1ab86795e820","Type":"ContainerStarted","Data":"4e15d44df4a97531aaab937f2c40bba8a35c8acdcc37542eb0a06872287b0d73"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.661492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" event={"ID":"f89fa753-5d7e-4c80-8847-25eeaee0c3e3","Type":"ContainerStarted","Data":"456c59b640013d02120dd8b61ecf465e00d691f57d94059319939fd85c660c6d"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.664067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wz8jk" event={"ID":"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb","Type":"ContainerStarted","Data":"be664efec00ed1df88732752fe19a975d69c5ab81fe2e0e26bd11647756d5e10"} Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.670484 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.245850144 podStartE2EDuration="27.670464112s" podCreationTimestamp="2025-10-02 18:37:42 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.021160201 +0000 UTC m=+1198.208656060" lastFinishedPulling="2025-10-02 18:38:08.445774179 +0000 UTC m=+1209.633270028" observedRunningTime="2025-10-02 18:38:09.667614173 +0000 UTC m=+1210.855110042" watchObservedRunningTime="2025-10-02 18:38:09.670464112 +0000 UTC m=+1210.857959981" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.727594 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sj5pf" podStartSLOduration=12.974271183 
podStartE2EDuration="23.727576939s" podCreationTimestamp="2025-10-02 18:37:46 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.615993313 +0000 UTC m=+1198.803489162" lastFinishedPulling="2025-10-02 18:38:08.369299059 +0000 UTC m=+1209.556794918" observedRunningTime="2025-10-02 18:38:09.718377063 +0000 UTC m=+1210.905872932" watchObservedRunningTime="2025-10-02 18:38:09.727576939 +0000 UTC m=+1210.915072798" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.756212 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-6584dc9448-rtc7s" podStartSLOduration=16.79328985 podStartE2EDuration="26.75619368s" podCreationTimestamp="2025-10-02 18:37:43 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.810552827 +0000 UTC m=+1198.998048686" lastFinishedPulling="2025-10-02 18:38:07.773456647 +0000 UTC m=+1208.960952516" observedRunningTime="2025-10-02 18:38:09.754081854 +0000 UTC m=+1210.941577723" watchObservedRunningTime="2025-10-02 18:38:09.75619368 +0000 UTC m=+1210.943689549" Oct 02 18:38:09 crc kubenswrapper[4909]: I1002 18:38:09.796090 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.097794686 podStartE2EDuration="29.79607385s" podCreationTimestamp="2025-10-02 18:37:40 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.077392782 +0000 UTC m=+1198.264888641" lastFinishedPulling="2025-10-02 18:38:05.775671906 +0000 UTC m=+1206.963167805" observedRunningTime="2025-10-02 18:38:09.788485075 +0000 UTC m=+1210.975980934" watchObservedRunningTime="2025-10-02 18:38:09.79607385 +0000 UTC m=+1210.983569709" Oct 02 18:38:10 crc kubenswrapper[4909]: I1002 18:38:10.675778 4909 generic.go:334] "Generic (PLEG): container finished" podID="01e45a1f-f70c-4e2f-94ed-763af4c5b5cb" containerID="be664efec00ed1df88732752fe19a975d69c5ab81fe2e0e26bd11647756d5e10" exitCode=0 Oct 02 18:38:10 crc kubenswrapper[4909]: I1002 18:38:10.675840 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wz8jk" event={"ID":"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb","Type":"ContainerDied","Data":"be664efec00ed1df88732752fe19a975d69c5ab81fe2e0e26bd11647756d5e10"} Oct 02 18:38:11 crc kubenswrapper[4909]: I1002 18:38:11.725476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wz8jk" event={"ID":"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb","Type":"ContainerStarted","Data":"f0a029ad1ad79ba7a0e4da6bd208dae0bd63b1ba692a21ba8e6dbfc29dd6bb36"} Oct 02 18:38:12 crc kubenswrapper[4909]: I1002 18:38:12.735072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerStarted","Data":"38d1261e38e7874de28fc5e1793400c1aa1f1875912aac6c9c8c3a3976295b43"} Oct 02 18:38:12 crc kubenswrapper[4909]: I1002 18:38:12.738207 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wz8jk" event={"ID":"01e45a1f-f70c-4e2f-94ed-763af4c5b5cb","Type":"ContainerStarted","Data":"689ab847d39fc76f7c28da7d8f20e76a57ad6285b185c39607f61a542716215e"} Oct 02 18:38:12 crc kubenswrapper[4909]: I1002 18:38:12.738462 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:38:12 crc kubenswrapper[4909]: I1002 18:38:12.787880 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wz8jk" podStartSLOduration=18.503798014 podStartE2EDuration="26.787859427s" podCreationTimestamp="2025-10-02 18:37:46 +0000 UTC" firstStartedPulling="2025-10-02 18:37:59.247226817 +0000 UTC m=+1200.434722676" lastFinishedPulling="2025-10-02 18:38:07.53128822 +0000 UTC m=+1208.718784089" observedRunningTime="2025-10-02 18:38:12.781970993 +0000 UTC m=+1213.969466852" watchObservedRunningTime="2025-10-02 18:38:12.787859427 +0000 UTC m=+1213.975355306" Oct 02 18:38:13 crc 
kubenswrapper[4909]: I1002 18:38:13.755239 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerStarted","Data":"ba73d188de58df3dffa8c592fb175de9e46bf125cd7abfcd3881fb87ceadff99"} Oct 02 18:38:13 crc kubenswrapper[4909]: I1002 18:38:13.756497 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:38:15 crc kubenswrapper[4909]: I1002 18:38:15.777653 4909 generic.go:334] "Generic (PLEG): container finished" podID="8fa4c480-f836-44f0-b313-ac6cf9e97262" containerID="04c465dc006be440e16a4f7452c47cec78deb1c33e2b56d7c2a9529aed48d032" exitCode=0 Oct 02 18:38:15 crc kubenswrapper[4909]: I1002 18:38:15.777924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fa4c480-f836-44f0-b313-ac6cf9e97262","Type":"ContainerDied","Data":"04c465dc006be440e16a4f7452c47cec78deb1c33e2b56d7c2a9529aed48d032"} Oct 02 18:38:15 crc kubenswrapper[4909]: I1002 18:38:15.786645 4909 generic.go:334] "Generic (PLEG): container finished" podID="f308ff37-5d0f-4b6f-8b67-1ab86795e820" containerID="4e15d44df4a97531aaab937f2c40bba8a35c8acdcc37542eb0a06872287b0d73" exitCode=0 Oct 02 18:38:15 crc kubenswrapper[4909]: I1002 18:38:15.786727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f308ff37-5d0f-4b6f-8b67-1ab86795e820","Type":"ContainerDied","Data":"4e15d44df4a97531aaab937f2c40bba8a35c8acdcc37542eb0a06872287b0d73"} Oct 02 18:38:15 crc kubenswrapper[4909]: I1002 18:38:15.970174 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.808611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"f308ff37-5d0f-4b6f-8b67-1ab86795e820","Type":"ContainerStarted","Data":"ba1e0728cb857ba0fba8b53e7f6f1f5fa84cbcd1b94eafff944fa0feadc1ee07"} Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.811795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"17ca7d1c-52c8-480e-975a-d22877f0971f","Type":"ContainerStarted","Data":"7eec4085803f8d76066e56b5cce79fc7947adbb8895dcab6bb9836a85c231b3e"} Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.815866 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fa4c480-f836-44f0-b313-ac6cf9e97262","Type":"ContainerStarted","Data":"d71c725ab4f2d97e3f4d28e77fa8baf8d9f69f638b3c3257f8a14f39964a4aec"} Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.818976 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e32a55a-2e45-4365-a8cf-3002a0c0ba73","Type":"ContainerStarted","Data":"c156b6fab93b041ca0a45e9900ba834ec0df83b18ea96526f7e77dce1d5f8caa"} Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.842531 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.755172226 podStartE2EDuration="38.842506798s" podCreationTimestamp="2025-10-02 18:37:39 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.039153161 +0000 UTC m=+1198.226649020" lastFinishedPulling="2025-10-02 18:38:07.126487733 +0000 UTC m=+1208.313983592" observedRunningTime="2025-10-02 18:38:17.837221604 +0000 UTC m=+1219.024717473" watchObservedRunningTime="2025-10-02 18:38:17.842506798 +0000 UTC m=+1219.030002677" Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.875658 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.970470285 podStartE2EDuration="28.875637419s" podCreationTimestamp="2025-10-02 18:37:49 +0000 UTC" 
firstStartedPulling="2025-10-02 18:37:58.702731173 +0000 UTC m=+1199.890227032" lastFinishedPulling="2025-10-02 18:38:16.607898307 +0000 UTC m=+1217.795394166" observedRunningTime="2025-10-02 18:38:17.867055643 +0000 UTC m=+1219.054551512" watchObservedRunningTime="2025-10-02 18:38:17.875637419 +0000 UTC m=+1219.063133278" Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.896081 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.841720680999998 podStartE2EDuration="38.896063006s" podCreationTimestamp="2025-10-02 18:37:39 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.315215392 +0000 UTC m=+1198.502711251" lastFinishedPulling="2025-10-02 18:38:08.369557717 +0000 UTC m=+1209.557053576" observedRunningTime="2025-10-02 18:38:17.895373984 +0000 UTC m=+1219.082869873" watchObservedRunningTime="2025-10-02 18:38:17.896063006 +0000 UTC m=+1219.083558865" Oct 02 18:38:17 crc kubenswrapper[4909]: I1002 18:38:17.931492 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.339457888 podStartE2EDuration="31.931470368s" podCreationTimestamp="2025-10-02 18:37:46 +0000 UTC" firstStartedPulling="2025-10-02 18:37:58.001339655 +0000 UTC m=+1199.188835514" lastFinishedPulling="2025-10-02 18:38:16.593352135 +0000 UTC m=+1217.780847994" observedRunningTime="2025-10-02 18:38:17.921780115 +0000 UTC m=+1219.109275984" watchObservedRunningTime="2025-10-02 18:38:17.931470368 +0000 UTC m=+1219.118966237" Oct 02 18:38:18 crc kubenswrapper[4909]: I1002 18:38:18.216316 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 18:38:18 crc kubenswrapper[4909]: I1002 18:38:18.216766 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 18:38:18 crc kubenswrapper[4909]: I1002 18:38:18.283960 4909 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 18:38:18 crc kubenswrapper[4909]: I1002 18:38:18.871199 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.129441 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:19 crc kubenswrapper[4909]: E1002 18:38:19.129928 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed94911-9861-4b29-9f39-cfd3be4bc01a" containerName="init" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.129954 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed94911-9861-4b29-9f39-cfd3be4bc01a" containerName="init" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.130268 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed94911-9861-4b29-9f39-cfd3be4bc01a" containerName="init" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.131585 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.133353 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.153355 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.232122 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bzwss"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.233211 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.237577 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.248789 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.248872 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.248909 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wpj\" (UniqueName: \"kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.248933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.252161 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-bzwss"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.350661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.350754 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovn-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.351595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.351637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3890e7f-68df-442e-be6c-d6c69309c66d-config\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.351692 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc 
kubenswrapper[4909]: I1002 18:38:19.351719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.352252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.352318 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wpj\" (UniqueName: \"kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.352349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.352384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfpg\" (UniqueName: \"kubernetes.io/projected/b3890e7f-68df-442e-be6c-d6c69309c66d-kube-api-access-srfpg\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 
crc kubenswrapper[4909]: I1002 18:38:19.352401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovs-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.352456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-combined-ca-bundle\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.353244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.373300 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wpj\" (UniqueName: \"kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj\") pod \"dnsmasq-dns-7fd796d7df-nwzhr\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.448740 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.449336 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.454396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.454797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srfpg\" (UniqueName: \"kubernetes.io/projected/b3890e7f-68df-442e-be6c-d6c69309c66d-kube-api-access-srfpg\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.454837 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovs-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.454902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-combined-ca-bundle\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.455016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovn-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " 
pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.455088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3890e7f-68df-442e-be6c-d6c69309c66d-config\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.455235 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovs-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.455307 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b3890e7f-68df-442e-be6c-d6c69309c66d-ovn-rundir\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.455915 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3890e7f-68df-442e-be6c-d6c69309c66d-config\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.458119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.459607 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890e7f-68df-442e-be6c-d6c69309c66d-combined-ca-bundle\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.471338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srfpg\" (UniqueName: \"kubernetes.io/projected/b3890e7f-68df-442e-be6c-d6c69309c66d-kube-api-access-srfpg\") pod \"ovn-controller-metrics-bzwss\" (UID: \"b3890e7f-68df-442e-be6c-d6c69309c66d\") " pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.471369 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.478880 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.483445 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.485200 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"] Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.548826 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bzwss" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.556282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.556346 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.556465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.556516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.556538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zs7s\" (UniqueName: \"kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" 
(UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.658240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.658312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.658333 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zs7s\" (UniqueName: \"kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.658359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.658405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.659521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.659614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.659972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.661865 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.685170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zs7s\" (UniqueName: \"kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s\") pod \"dnsmasq-dns-86db49b7ff-n94sx\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.839805 
4909 generic.go:334] "Generic (PLEG): container finished" podID="7485bfaa-555b-4469-9681-bc735a109726" containerID="38d1261e38e7874de28fc5e1793400c1aa1f1875912aac6c9c8c3a3976295b43" exitCode=0 Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.840118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerDied","Data":"38d1261e38e7874de28fc5e1793400c1aa1f1875912aac6c9c8c3a3976295b43"} Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.847126 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:19 crc kubenswrapper[4909]: I1002 18:38:19.998439 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.128228 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bzwss"] Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.453465 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.453971 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.498067 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"] Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.507010 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.507074 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.560962 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Oct 02 18:38:20 crc kubenswrapper[4909]: E1002 18:38:20.802796 4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:33904->38.102.83.129:45429: write tcp 38.102.83.129:33904->38.102.83.129:45429: write: broken pipe Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.848991 4909 generic.go:334] "Generic (PLEG): container finished" podID="ebea1da7-7dd1-42e4-8e02-e332cb916fd9" containerID="a86d0c9eb4336bd13f452b9ca4de464209fbf76d76b9f6148f9fea67b5e0fc07" exitCode=0 Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.849049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" event={"ID":"ebea1da7-7dd1-42e4-8e02-e332cb916fd9","Type":"ContainerDied","Data":"a86d0c9eb4336bd13f452b9ca4de464209fbf76d76b9f6148f9fea67b5e0fc07"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.849108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" event={"ID":"ebea1da7-7dd1-42e4-8e02-e332cb916fd9","Type":"ContainerStarted","Data":"497f4861f9d50e45fc2c150a690eecc910ec9c4b178181a5d78b1913b9eff520"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.852684 4909 generic.go:334] "Generic (PLEG): container finished" podID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerID="893bf2dadbbb04a2389b14d7f7691ac40f06774e70463e550974d8c417b061ac" exitCode=0 Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.852774 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" event={"ID":"d67393e0-ece8-4934-bc68-2c5f4be3e92b","Type":"ContainerDied","Data":"893bf2dadbbb04a2389b14d7f7691ac40f06774e70463e550974d8c417b061ac"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.852814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" 
event={"ID":"d67393e0-ece8-4934-bc68-2c5f4be3e92b","Type":"ContainerStarted","Data":"a2b798576f38613e1487054f9ff96a5b2b57c2f4c842372287a5ce3fad13783e"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.854814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bzwss" event={"ID":"b3890e7f-68df-442e-be6c-d6c69309c66d","Type":"ContainerStarted","Data":"7f9fd2ac8d1cfa567a92c7be99474fd24559a9f707d5acecdb76073e9d85a037"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.854867 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bzwss" event={"ID":"b3890e7f-68df-442e-be6c-d6c69309c66d","Type":"ContainerStarted","Data":"7e30f2e75e64460385cbee815dd1e2fa64623acdf31694254b63f9862ff9ef8e"} Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.883866 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.883912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.891902 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bzwss" podStartSLOduration=1.891864946 podStartE2EDuration="1.891864946s" podCreationTimestamp="2025-10-02 18:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:38:20.885394164 +0000 UTC m=+1222.072890023" watchObservedRunningTime="2025-10-02 18:38:20.891864946 +0000 UTC m=+1222.079360805" Oct 02 18:38:20 crc kubenswrapper[4909]: I1002 18:38:20.932378 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.189556 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.236467 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 18:38:21 crc kubenswrapper[4909]: E1002 18:38:21.236813 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebea1da7-7dd1-42e4-8e02-e332cb916fd9" containerName="init" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.236828 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebea1da7-7dd1-42e4-8e02-e332cb916fd9" containerName="init" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.237061 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebea1da7-7dd1-42e4-8e02-e332cb916fd9" containerName="init" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.238959 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.243735 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ps29k" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.244014 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.244165 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.244267 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.256099 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.294726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wpj\" (UniqueName: 
\"kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj\") pod \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.294877 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb\") pod \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.294919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc\") pod \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config\") pod \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\" (UID: \"ebea1da7-7dd1-42e4-8e02-e332cb916fd9\") " Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc 
kubenswrapper[4909]: I1002 18:38:21.295895 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-scripts\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-config\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.295993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.296019 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ps7j\" (UniqueName: \"kubernetes.io/projected/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-kube-api-access-4ps7j\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.298819 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj" (OuterVolumeSpecName: "kube-api-access-26wpj") pod "ebea1da7-7dd1-42e4-8e02-e332cb916fd9" (UID: "ebea1da7-7dd1-42e4-8e02-e332cb916fd9"). InnerVolumeSpecName "kube-api-access-26wpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.328887 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebea1da7-7dd1-42e4-8e02-e332cb916fd9" (UID: "ebea1da7-7dd1-42e4-8e02-e332cb916fd9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.329081 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebea1da7-7dd1-42e4-8e02-e332cb916fd9" (UID: "ebea1da7-7dd1-42e4-8e02-e332cb916fd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.336094 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config" (OuterVolumeSpecName: "config") pod "ebea1da7-7dd1-42e4-8e02-e332cb916fd9" (UID: "ebea1da7-7dd1-42e4-8e02-e332cb916fd9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397162 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ps7j\" (UniqueName: \"kubernetes.io/projected/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-kube-api-access-4ps7j\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397223 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397397 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-scripts\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-config\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397477 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397538 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wpj\" (UniqueName: \"kubernetes.io/projected/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-kube-api-access-26wpj\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397552 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397566 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.397579 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebea1da7-7dd1-42e4-8e02-e332cb916fd9-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.401053 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-config\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 
18:38:21.401275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.401369 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-scripts\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.401584 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.403489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.403788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.417890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ps7j\" (UniqueName: \"kubernetes.io/projected/5db23d92-24c2-4ee0-a489-adbb1c9cc04e-kube-api-access-4ps7j\") pod \"ovn-northd-0\" (UID: 
\"5db23d92-24c2-4ee0-a489-adbb1c9cc04e\") " pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.577086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.866548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" event={"ID":"ebea1da7-7dd1-42e4-8e02-e332cb916fd9","Type":"ContainerDied","Data":"497f4861f9d50e45fc2c150a690eecc910ec9c4b178181a5d78b1913b9eff520"} Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.866804 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nwzhr" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.866814 4909 scope.go:117] "RemoveContainer" containerID="a86d0c9eb4336bd13f452b9ca4de464209fbf76d76b9f6148f9fea67b5e0fc07" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.872877 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" event={"ID":"d67393e0-ece8-4934-bc68-2c5f4be3e92b","Type":"ContainerStarted","Data":"e6dbc46934f79aad30a6ac92846f0a879b3a50e1e2055419d8bba58597942a31"} Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.873702 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.937046 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.948225 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nwzhr"] Oct 02 18:38:21 crc kubenswrapper[4909]: I1002 18:38:21.950635 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" podStartSLOduration=2.950611104 podStartE2EDuration="2.950611104s" 
podCreationTimestamp="2025-10-02 18:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:38:21.93697629 +0000 UTC m=+1223.124472159" watchObservedRunningTime="2025-10-02 18:38:21.950611104 +0000 UTC m=+1223.138106963" Oct 02 18:38:22 crc kubenswrapper[4909]: I1002 18:38:22.065437 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 18:38:22 crc kubenswrapper[4909]: I1002 18:38:22.892228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5db23d92-24c2-4ee0-a489-adbb1c9cc04e","Type":"ContainerStarted","Data":"c97daedb76242a0a1942605bf6ac13ffc1f1858c7a1c3bfd9ea9ae4d3cccd81c"} Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.051182 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.135630 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.152043 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"] Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.191701 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.193727 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.199774 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.265986 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.340613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.340667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.340729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.340936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " 
pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.341098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f56fh\" (UniqueName: \"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f56fh\" (UniqueName: \"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " 
pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.443991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.444419 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.444446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.444572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.461620 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f56fh\" (UniqueName: \"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh\") pod \"dnsmasq-dns-698758b865-j28mh\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.528913 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:23 crc kubenswrapper[4909]: I1002 18:38:23.619613 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebea1da7-7dd1-42e4-8e02-e332cb916fd9" path="/var/lib/kubelet/pods/ebea1da7-7dd1-42e4-8e02-e332cb916fd9/volumes" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.034084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:38:24 crc kubenswrapper[4909]: W1002 18:38:24.041170 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbea14e5_7c6a_45e8_9f93_598e3fdfa9b1.slice/crio-83dac0a0f2e20cdfba174a39cc6840d921fadade571bd57ae2f829139276b58e WatchSource:0}: Error finding container 83dac0a0f2e20cdfba174a39cc6840d921fadade571bd57ae2f829139276b58e: Status 404 returned error can't find the container with id 83dac0a0f2e20cdfba174a39cc6840d921fadade571bd57ae2f829139276b58e Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.374220 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.379510 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.381642 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.381664 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.381854 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ww9j7" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.381895 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.402058 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.477186 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.477253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-lock\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.477286 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rg9k\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-kube-api-access-4rg9k\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 
18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.477469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.477518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-cache\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.578905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.578956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-cache\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.578991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.579019 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-lock\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.579061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rg9k\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-kube-api-access-4rg9k\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: E1002 18:38:24.579416 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 18:38:24 crc kubenswrapper[4909]: E1002 18:38:24.579462 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.579502 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: E1002 18:38:24.579532 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:25.079509016 +0000 UTC m=+1226.267004945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.579724 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-lock\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.579938 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-cache\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.602720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rg9k\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-kube-api-access-4rg9k\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.606763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.919455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerStarted","Data":"05a901e70c829631a65398a5400d201062d524ec9ab9666de970be3a1df6795f"} Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 
18:38:24.919595 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="dnsmasq-dns" containerID="cri-o://e6dbc46934f79aad30a6ac92846f0a879b3a50e1e2055419d8bba58597942a31" gracePeriod=10 Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.919650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerStarted","Data":"83dac0a0f2e20cdfba174a39cc6840d921fadade571bd57ae2f829139276b58e"} Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.943868 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4kkkt"] Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.945533 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.956952 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.956958 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.964134 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 18:38:24 crc kubenswrapper[4909]: I1002 18:38:24.965233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4kkkt"] Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " 
pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088462 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: E1002 18:38:25.088496 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 18:38:25 crc kubenswrapper[4909]: E1002 18:38:25.088515 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088554 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbt5v\" (UniqueName: \"kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: E1002 18:38:25.088565 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:26.088547527 +0000 UTC m=+1227.276043446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088602 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088769 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.088873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.192753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194144 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbt5v\" (UniqueName: \"kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.194516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.195843 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.196116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.196761 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.197081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts\") pod \"swift-ring-rebalance-4kkkt\" (UID: 
\"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.198791 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.205188 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.214408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbt5v\" (UniqueName: \"kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v\") pod \"swift-ring-rebalance-4kkkt\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.268584 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.589750 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4kkkt"] Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.930977 4909 generic.go:334] "Generic (PLEG): container finished" podID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerID="05a901e70c829631a65398a5400d201062d524ec9ab9666de970be3a1df6795f" exitCode=0 Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.931161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerDied","Data":"05a901e70c829631a65398a5400d201062d524ec9ab9666de970be3a1df6795f"} Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.933443 4909 generic.go:334] "Generic (PLEG): container finished" podID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerID="e6dbc46934f79aad30a6ac92846f0a879b3a50e1e2055419d8bba58597942a31" exitCode=0 Oct 02 18:38:25 crc kubenswrapper[4909]: I1002 18:38:25.933484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" event={"ID":"d67393e0-ece8-4934-bc68-2c5f4be3e92b","Type":"ContainerDied","Data":"e6dbc46934f79aad30a6ac92846f0a879b3a50e1e2055419d8bba58597942a31"} Oct 02 18:38:26 crc kubenswrapper[4909]: I1002 18:38:26.115254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:26 crc kubenswrapper[4909]: E1002 18:38:26.115875 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 18:38:26 crc kubenswrapper[4909]: E1002 18:38:26.115891 4909 projected.go:194] Error preparing 
data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 18:38:26 crc kubenswrapper[4909]: E1002 18:38:26.115926 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:28.11591314 +0000 UTC m=+1229.303408999 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found Oct 02 18:38:30 crc kubenswrapper[4909]: W1002 18:38:27.796679 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9cab60_e149_4118_a3ca_4423620830bf.slice/crio-35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e WatchSource:0}: Error finding container 35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e: Status 404 returned error can't find the container with id 35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.803115 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.878541 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.950924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4kkkt" event={"ID":"0e9cab60-e149-4118-a3ca-4423620830bf","Type":"ContainerStarted","Data":"35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e"} Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.954442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" event={"ID":"d67393e0-ece8-4934-bc68-2c5f4be3e92b","Type":"ContainerDied","Data":"a2b798576f38613e1487054f9ff96a5b2b57c2f4c842372287a5ce3fad13783e"} Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.954503 4909 scope.go:117] "RemoveContainer" containerID="e6dbc46934f79aad30a6ac92846f0a879b3a50e1e2055419d8bba58597942a31" Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:27.954541 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-n94sx" Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.055546 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zs7s\" (UniqueName: \"kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s\") pod \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.055616 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc\") pod \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") " Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.055697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config\") pod \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") "
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.055785 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb\") pod \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") "
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.055900 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb\") pod \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\" (UID: \"d67393e0-ece8-4934-bc68-2c5f4be3e92b\") "
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.061091 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s" (OuterVolumeSpecName: "kube-api-access-6zs7s") pod "d67393e0-ece8-4934-bc68-2c5f4be3e92b" (UID: "d67393e0-ece8-4934-bc68-2c5f4be3e92b"). InnerVolumeSpecName "kube-api-access-6zs7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.101639 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d67393e0-ece8-4934-bc68-2c5f4be3e92b" (UID: "d67393e0-ece8-4934-bc68-2c5f4be3e92b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.103072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d67393e0-ece8-4934-bc68-2c5f4be3e92b" (UID: "d67393e0-ece8-4934-bc68-2c5f4be3e92b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.112649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d67393e0-ece8-4934-bc68-2c5f4be3e92b" (UID: "d67393e0-ece8-4934-bc68-2c5f4be3e92b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.117549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config" (OuterVolumeSpecName: "config") pod "d67393e0-ece8-4934-bc68-2c5f4be3e92b" (UID: "d67393e0-ece8-4934-bc68-2c5f4be3e92b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157691 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157872 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157886 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157896 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157909 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zs7s\" (UniqueName: \"kubernetes.io/projected/d67393e0-ece8-4934-bc68-2c5f4be3e92b-kube-api-access-6zs7s\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.157918 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67393e0-ece8-4934-bc68-2c5f4be3e92b-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:30 crc kubenswrapper[4909]: E1002 18:38:28.158076 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 18:38:30 crc kubenswrapper[4909]: E1002 18:38:28.158113 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 18:38:30 crc kubenswrapper[4909]: E1002 18:38:28.158174 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:32.158153232 +0000 UTC m=+1233.345649161 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.297190 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"]
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:28.305646 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-n94sx"]
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.625274 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" path="/var/lib/kubelet/pods/d67393e0-ece8-4934-bc68-2c5f4be3e92b/volumes"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.711517 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-b6c9598fb-p2t8g" podUID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" containerName="console" containerID="cri-o://61ada46d0fb7375d25a32ca0f28d75d250d69abf72e3b72b1a834a7ea297c9dd" gracePeriod=15
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.975022 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6c9598fb-p2t8g_2806f9d6-de1a-4c32-bec7-c06e06884bd5/console/0.log"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.975226 4909 generic.go:334] "Generic (PLEG): container finished" podID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" containerID="61ada46d0fb7375d25a32ca0f28d75d250d69abf72e3b72b1a834a7ea297c9dd" exitCode=2
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.975323 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c9598fb-p2t8g" event={"ID":"2806f9d6-de1a-4c32-bec7-c06e06884bd5","Type":"ContainerDied","Data":"61ada46d0fb7375d25a32ca0f28d75d250d69abf72e3b72b1a834a7ea297c9dd"}
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:29.982713 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.039440 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="f308ff37-5d0f-4b6f-8b67-1ab86795e820" containerName="galera" probeResult="failure" output=<
Oct 02 18:38:30 crc kubenswrapper[4909]: wsrep_local_state_comment (Joined) differs from Synced
Oct 02 18:38:30 crc kubenswrapper[4909]: >
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.492264 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.888587 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6z9g5"]
Oct 02 18:38:30 crc kubenswrapper[4909]: E1002 18:38:30.889253 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="init"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.889274 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="init"
Oct 02 18:38:30 crc kubenswrapper[4909]: E1002 18:38:30.889293 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="dnsmasq-dns"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.889301 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="dnsmasq-dns"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.889560 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67393e0-ece8-4934-bc68-2c5f4be3e92b" containerName="dnsmasq-dns"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.890353 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.895377 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6z9g5"]
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.983117 4909 generic.go:334] "Generic (PLEG): container finished" podID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerID="f418b687441436b3a07b048f6faffb535085d2fc78e0e54a8616c84206f2f045" exitCode=0
Oct 02 18:38:30 crc kubenswrapper[4909]: I1002 18:38:30.983194 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerDied","Data":"f418b687441436b3a07b048f6faffb535085d2fc78e0e54a8616c84206f2f045"}
Oct 02 18:38:31 crc kubenswrapper[4909]: I1002 18:38:31.015794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbsc\" (UniqueName: \"kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc\") pod \"placement-db-create-6z9g5\" (UID: \"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1\") " pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:31 crc kubenswrapper[4909]: I1002 18:38:31.117999 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbsc\" (UniqueName: \"kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc\") pod \"placement-db-create-6z9g5\" (UID: \"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1\") " pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:31 crc kubenswrapper[4909]: I1002 18:38:31.140819 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbsc\" (UniqueName: \"kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc\") pod \"placement-db-create-6z9g5\" (UID: \"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1\") " pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:31 crc kubenswrapper[4909]: I1002 18:38:31.213767 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:32 crc kubenswrapper[4909]: I1002 18:38:32.242887 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0"
Oct 02 18:38:32 crc kubenswrapper[4909]: E1002 18:38:32.243108 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 18:38:32 crc kubenswrapper[4909]: E1002 18:38:32.243242 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 18:38:32 crc kubenswrapper[4909]: E1002 18:38:32.243293 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:40.243276448 +0000 UTC m=+1241.430772317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.000283 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5dqlm"]
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.002286 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.022573 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5dqlm"]
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.159927 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2lx7\" (UniqueName: \"kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7\") pod \"mysqld-exporter-openstack-db-create-5dqlm\" (UID: \"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225\") " pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.262324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2lx7\" (UniqueName: \"kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7\") pod \"mysqld-exporter-openstack-db-create-5dqlm\" (UID: \"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225\") " pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.290503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2lx7\" (UniqueName: \"kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7\") pod \"mysqld-exporter-openstack-db-create-5dqlm\" (UID: \"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225\") " pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:33 crc kubenswrapper[4909]: I1002 18:38:33.349376 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:34 crc kubenswrapper[4909]: E1002 18:38:34.173086 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00"
Oct 02 18:38:34 crc kubenswrapper[4909]: E1002 18:38:34.173593 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,Command:[],Args:[--web.console.templates=/etc/prometheus/consoles --web.console.libraries=/etc/prometheus/console_libraries --config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dkvgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(7485bfaa-555b-4469-9681-bc735a109726): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.566619 4909 scope.go:117] "RemoveContainer" containerID="893bf2dadbbb04a2389b14d7f7691ac40f06774e70463e550974d8c417b061ac"
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.912316 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6c9598fb-p2t8g_2806f9d6-de1a-4c32-bec7-c06e06884bd5/console/0.log"
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.912625 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6c9598fb-p2t8g"
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.995955 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996567 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996588 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996627 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996651 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjc2g\" (UniqueName: \"kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.996717 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle\") pod \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\" (UID: \"2806f9d6-de1a-4c32-bec7-c06e06884bd5\") "
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.997380 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config" (OuterVolumeSpecName: "console-config") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.997746 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca" (OuterVolumeSpecName: "service-ca") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.997900 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:34 crc kubenswrapper[4909]: I1002 18:38:34.998841 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.001684 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.002215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.003203 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g" (OuterVolumeSpecName: "kube-api-access-wjc2g") pod "2806f9d6-de1a-4c32-bec7-c06e06884bd5" (UID: "2806f9d6-de1a-4c32-bec7-c06e06884bd5"). InnerVolumeSpecName "kube-api-access-wjc2g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.031320 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerStarted","Data":"670db3e1871ba87e849dc034f28ebc741ee8f76e51481d6a9a67e91fe691d88b"}
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.031544 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.035871 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6c9598fb-p2t8g_2806f9d6-de1a-4c32-bec7-c06e06884bd5/console/0.log"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.035950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c9598fb-p2t8g" event={"ID":"2806f9d6-de1a-4c32-bec7-c06e06884bd5","Type":"ContainerDied","Data":"8c4dc6d2bef4fd35fb03c5caefe247527bc36d6d3b92f5e6f1f7de3a9b4d67ff"}
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.035966 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6c9598fb-p2t8g"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.035986 4909 scope.go:117] "RemoveContainer" containerID="61ada46d0fb7375d25a32ca0f28d75d250d69abf72e3b72b1a834a7ea297c9dd"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.038775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5db23d92-24c2-4ee0-a489-adbb1c9cc04e","Type":"ContainerStarted","Data":"f825c0282ea5cfaa890ef7962cb70418325a8910c22b1ea1774a090219cb8b8f"}
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.043691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerStarted","Data":"0e13f289e4db459670b65411637693512c793cd459cbfe0ac5cf0f258403c830"}
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.043953 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-j28mh"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.062281 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.724605688 podStartE2EDuration="59.062262082s" podCreationTimestamp="2025-10-02 18:37:36 +0000 UTC" firstStartedPulling="2025-10-02 18:37:38.12800512 +0000 UTC m=+1179.315500979" lastFinishedPulling="2025-10-02 18:37:56.465661514 +0000 UTC m=+1197.653157373" observedRunningTime="2025-10-02 18:38:35.055913144 +0000 UTC m=+1236.243409023" watchObservedRunningTime="2025-10-02 18:38:35.062262082 +0000 UTC m=+1236.249757961"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098851 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"]
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098909 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098933 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-service-ca\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098942 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098951 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjc2g\" (UniqueName: \"kubernetes.io/projected/2806f9d6-de1a-4c32-bec7-c06e06884bd5-kube-api-access-wjc2g\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098960 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098969 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.098978 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2806f9d6-de1a-4c32-bec7-c06e06884bd5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.106274 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b6c9598fb-p2t8g"]
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.111099 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-j28mh" podStartSLOduration=12.111069427 podStartE2EDuration="12.111069427s" podCreationTimestamp="2025-10-02 18:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:38:35.086178369 +0000 UTC m=+1236.273674228" watchObservedRunningTime="2025-10-02 18:38:35.111069427 +0000 UTC m=+1236.298565286"
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.142529 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6z9g5"]
Oct 02 18:38:35 crc kubenswrapper[4909]: W1002 18:38:35.145381 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97095a9f_e6b9_49d3_8fd5_684e40e9d6b1.slice/crio-43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b WatchSource:0}: Error finding container 43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b: Status 404 returned error can't find the container with id 43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.238070 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5dqlm"]
Oct 02 18:38:35 crc kubenswrapper[4909]: W1002 18:38:35.245138 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d4ccd2_92a9_4408_b2d9_dc8c1a2d9225.slice/crio-31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b WatchSource:0}: Error finding container 31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b: Status 404 returned error can't find the container with id 31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b
Oct 02 18:38:35 crc kubenswrapper[4909]: I1002 18:38:35.626920 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" path="/var/lib/kubelet/pods/2806f9d6-de1a-4c32-bec7-c06e06884bd5/volumes"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.054665 4909 generic.go:334] "Generic (PLEG): container finished" podID="97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" containerID="b6a1d19b9e9b07e6d125ce702b321533ea946c61ba2d7a8c025ff5343d287de4" exitCode=0
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.054876 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6z9g5" event={"ID":"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1","Type":"ContainerDied","Data":"b6a1d19b9e9b07e6d125ce702b321533ea946c61ba2d7a8c025ff5343d287de4"}
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.055004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6z9g5" event={"ID":"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1","Type":"ContainerStarted","Data":"43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b"}
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.056773 4909 generic.go:334] "Generic (PLEG): container finished" podID="85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" containerID="69dde16e06e8bb96685e09b5e31cf359492b6ea95a0e979552adcadfbce9b2b9" exitCode=0
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.056814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm" event={"ID":"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225","Type":"ContainerDied","Data":"69dde16e06e8bb96685e09b5e31cf359492b6ea95a0e979552adcadfbce9b2b9"}
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.056830 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm" event={"ID":"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225","Type":"ContainerStarted","Data":"31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b"}
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.061289 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5db23d92-24c2-4ee0-a489-adbb1c9cc04e","Type":"ContainerStarted","Data":"38880b9040ee29de5c536662c477bfacedb5fa3ac97c2ce4cabee36a192c6717"}
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.061718 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.097253 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.577985178 podStartE2EDuration="15.097235422s" podCreationTimestamp="2025-10-02 18:38:21 +0000 UTC" firstStartedPulling="2025-10-02 18:38:22.076547943 +0000 UTC m=+1223.264043802" lastFinishedPulling="2025-10-02 18:38:34.595798187 +0000 UTC m=+1235.783294046" observedRunningTime="2025-10-02 18:38:36.088942192 +0000 UTC m=+1237.276438071" watchObservedRunningTime="2025-10-02 18:38:36.097235422 +0000 UTC m=+1237.284731271"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.215601 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kvwsj"]
Oct 02 18:38:36 crc kubenswrapper[4909]: E1002 18:38:36.216164 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" containerName="console"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.216186 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" containerName="console"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.216365 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2806f9d6-de1a-4c32-bec7-c06e06884bd5" containerName="console"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.217263 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kvwsj"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.227927 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kvwsj"]
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.335045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ftln\" (UniqueName: \"kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln\") pod \"glance-db-create-kvwsj\" (UID: \"8c24f10a-b08b-41f8-a71c-c406f531ea88\") " pod="openstack/glance-db-create-kvwsj"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.436870 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ftln\" (UniqueName: \"kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln\") pod \"glance-db-create-kvwsj\" (UID: \"8c24f10a-b08b-41f8-a71c-c406f531ea88\") " pod="openstack/glance-db-create-kvwsj"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.481155 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ftln\" (UniqueName: \"kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln\") pod \"glance-db-create-kvwsj\" (UID: \"8c24f10a-b08b-41f8-a71c-c406f531ea88\") " pod="openstack/glance-db-create-kvwsj"
Oct 02 18:38:36 crc kubenswrapper[4909]: I1002 18:38:36.536061 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kvwsj"
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.652052 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6z9g5"
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.659012 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm"
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.760779 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2lx7\" (UniqueName: \"kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7\") pod \"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225\" (UID: \"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225\") "
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.760977 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbsc\" (UniqueName: \"kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc\") pod \"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1\" (UID: \"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1\") "
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.765364 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7" (OuterVolumeSpecName: "kube-api-access-p2lx7") pod "85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" (UID: "85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225"). InnerVolumeSpecName "kube-api-access-p2lx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.766918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc" (OuterVolumeSpecName: "kube-api-access-lwbsc") pod "97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" (UID: "97095a9f-e6b9-49d3-8fd5-684e40e9d6b1"). InnerVolumeSpecName "kube-api-access-lwbsc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.862782 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2lx7\" (UniqueName: \"kubernetes.io/projected/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225-kube-api-access-p2lx7\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:37 crc kubenswrapper[4909]: I1002 18:38:37.862816 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwbsc\" (UniqueName: \"kubernetes.io/projected/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1-kube-api-access-lwbsc\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.038954 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kvwsj"] Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.080969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6z9g5" event={"ID":"97095a9f-e6b9-49d3-8fd5-684e40e9d6b1","Type":"ContainerDied","Data":"43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b"} Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.081004 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fab032a42789708831741e2024d71b5bba6dcbdb40e162d126166fd5042e0b" Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.081082 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6z9g5" Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.083905 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kvwsj" event={"ID":"8c24f10a-b08b-41f8-a71c-c406f531ea88","Type":"ContainerStarted","Data":"1d66c9684025a8019aed2ad0cf9973a10ee9ec6cdbfbdcecd18e4488ce93a0af"} Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.087292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm" event={"ID":"85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225","Type":"ContainerDied","Data":"31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b"} Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.087343 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f5d670b8eaa7ba27cc7eff52ed9b4686cb984a4248a900699ef97825ca008b" Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.087417 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5dqlm" Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.090790 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerStarted","Data":"1dd406d8ce1e070a752a947917123f68ae98f8d1616c91ddc9f0a04673148e7f"} Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.094013 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4kkkt" event={"ID":"0e9cab60-e149-4118-a3ca-4423620830bf","Type":"ContainerStarted","Data":"d5008b9b81b04d9bc5917c2660db3114e376fdb6b8799f6c49f8e5bb8ed8730b"} Oct 02 18:38:38 crc kubenswrapper[4909]: I1002 18:38:38.111746 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-4kkkt" podStartSLOduration=4.371729665 podStartE2EDuration="14.111729867s" podCreationTimestamp="2025-10-02 18:38:24 +0000 UTC" firstStartedPulling="2025-10-02 18:38:27.802754287 +0000 UTC m=+1228.990250166" lastFinishedPulling="2025-10-02 18:38:37.542754509 +0000 UTC m=+1238.730250368" observedRunningTime="2025-10-02 18:38:38.109791677 +0000 UTC m=+1239.297287536" watchObservedRunningTime="2025-10-02 18:38:38.111729867 +0000 UTC m=+1239.299225726" Oct 02 18:38:39 crc kubenswrapper[4909]: I1002 18:38:39.108149 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c24f10a-b08b-41f8-a71c-c406f531ea88" containerID="1b20789f7f0f58cf5f2d9b820d017318853573b3a27353e775066e5665ebe140" exitCode=0 Oct 02 18:38:39 crc kubenswrapper[4909]: I1002 18:38:39.109525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kvwsj" event={"ID":"8c24f10a-b08b-41f8-a71c-c406f531ea88","Type":"ContainerDied","Data":"1b20789f7f0f58cf5f2d9b820d017318853573b3a27353e775066e5665ebe140"} Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.306772 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.307122 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.307151 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.307213 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift podName:bc25944b-75d4-4e4e-b1de-57794b5c4bcf nodeName:}" failed. No retries permitted until 2025-10-02 18:38:56.307195598 +0000 UTC m=+1257.494691457 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift") pod "swift-storage-0" (UID: "bc25944b-75d4-4e4e-b1de-57794b5c4bcf") : configmap "swift-ring-files" not found Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.501451 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-kc8f6"] Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.501873 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.501892 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.501934 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.501942 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.502160 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.502187 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" containerName="mariadb-database-create" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.502961 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.518014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kc8f6"] Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.612325 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgjw2\" (UniqueName: \"kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2\") pod \"keystone-db-create-kc8f6\" (UID: \"11e0572e-0def-4e7e-958d-8272391b538d\") " pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.713533 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgjw2\" (UniqueName: \"kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2\") pod \"keystone-db-create-kc8f6\" (UID: \"11e0572e-0def-4e7e-958d-8272391b538d\") " pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.733368 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgjw2\" (UniqueName: \"kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2\") pod \"keystone-db-create-kc8f6\" (UID: \"11e0572e-0def-4e7e-958d-8272391b538d\") " pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.815118 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kvwsj" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.842518 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.916943 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ftln\" (UniqueName: \"kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln\") pod \"8c24f10a-b08b-41f8-a71c-c406f531ea88\" (UID: \"8c24f10a-b08b-41f8-a71c-c406f531ea88\") " Oct 02 18:38:40 crc kubenswrapper[4909]: I1002 18:38:40.920374 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln" (OuterVolumeSpecName: "kube-api-access-9ftln") pod "8c24f10a-b08b-41f8-a71c-c406f531ea88" (UID: "8c24f10a-b08b-41f8-a71c-c406f531ea88"). InnerVolumeSpecName "kube-api-access-9ftln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:40 crc kubenswrapper[4909]: E1002 18:38:40.926885 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.018817 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ftln\" (UniqueName: \"kubernetes.io/projected/8c24f10a-b08b-41f8-a71c-c406f531ea88-kube-api-access-9ftln\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.124373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kvwsj" event={"ID":"8c24f10a-b08b-41f8-a71c-c406f531ea88","Type":"ContainerDied","Data":"1d66c9684025a8019aed2ad0cf9973a10ee9ec6cdbfbdcecd18e4488ce93a0af"} Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.124422 4909 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="1d66c9684025a8019aed2ad0cf9973a10ee9ec6cdbfbdcecd18e4488ce93a0af" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.124389 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kvwsj" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.126841 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerStarted","Data":"46489191a0dfcafaea5edd7abbdae4ba42500ceb3e817660f8e296eb04493bf7"} Oct 02 18:38:41 crc kubenswrapper[4909]: E1002 18:38:41.129395 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.294339 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kc8f6"] Oct 02 18:38:41 crc kubenswrapper[4909]: W1002 18:38:41.308001 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11e0572e_0def_4e7e_958d_8272391b538d.slice/crio-358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432 WatchSource:0}: Error finding container 358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432: Status 404 returned error can't find the container with id 358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432 Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.585393 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sj5pf" podUID="9f1ef01b-ae9b-4a58-a411-d2e2e5770742" containerName="ovn-controller" 
probeResult="failure" output=< Oct 02 18:38:41 crc kubenswrapper[4909]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 18:38:41 crc kubenswrapper[4909]: > Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.600699 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.606963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wz8jk" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.845447 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sj5pf-config-764tv"] Oct 02 18:38:41 crc kubenswrapper[4909]: E1002 18:38:41.845791 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c24f10a-b08b-41f8-a71c-c406f531ea88" containerName="mariadb-database-create" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.845807 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c24f10a-b08b-41f8-a71c-c406f531ea88" containerName="mariadb-database-create" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.845995 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c24f10a-b08b-41f8-a71c-c406f531ea88" containerName="mariadb-database-create" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.846580 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.848842 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.865759 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sj5pf-config-764tv"] Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.944932 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6m24\" (UniqueName: \"kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.945127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.945217 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.945318 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: 
\"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.945402 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:41 crc kubenswrapper[4909]: I1002 18:38:41.945569 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.047818 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.047895 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.047966 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" 
(UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6m24\" (UniqueName: \"kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048184 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: 
\"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.048809 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.050377 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.068053 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6m24\" (UniqueName: \"kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24\") pod \"ovn-controller-sj5pf-config-764tv\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.136821 4909 generic.go:334] "Generic (PLEG): container finished" podID="11e0572e-0def-4e7e-958d-8272391b538d" containerID="f51ffe3c0aed78cee8fcf6f09387b423ed2ab81c715f72dd09bcbf02c5606446" exitCode=0 Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.136946 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kc8f6" event={"ID":"11e0572e-0def-4e7e-958d-8272391b538d","Type":"ContainerDied","Data":"f51ffe3c0aed78cee8fcf6f09387b423ed2ab81c715f72dd09bcbf02c5606446"} Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.137004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kc8f6" event={"ID":"11e0572e-0def-4e7e-958d-8272391b538d","Type":"ContainerStarted","Data":"358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432"} Oct 02 18:38:42 crc kubenswrapper[4909]: E1002 18:38:42.139987 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.202529 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:42 crc kubenswrapper[4909]: I1002 18:38:42.711850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sj5pf-config-764tv"] Oct 02 18:38:42 crc kubenswrapper[4909]: W1002 18:38:42.785991 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1d38c79_ac75_40be_b248_7c1438bc1c1f.slice/crio-cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77 WatchSource:0}: Error finding container cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77: Status 404 returned error can't find the container with id cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77 Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.134350 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-4f3e-account-create-gdbfn"] Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.135764 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.138423 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.148740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf-config-764tv" event={"ID":"e1d38c79-ac75-40be-b248-7c1438bc1c1f","Type":"ContainerStarted","Data":"8f7ad0bec3f246f980174fcd5828e9eb8614997292f5de07bfa1aa0b5c83b23e"} Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.148808 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf-config-764tv" event={"ID":"e1d38c79-ac75-40be-b248-7c1438bc1c1f","Type":"ContainerStarted","Data":"cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77"} Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.149764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-4f3e-account-create-gdbfn"] Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.187193 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sj5pf-config-764tv" podStartSLOduration=2.187174487 podStartE2EDuration="2.187174487s" podCreationTimestamp="2025-10-02 18:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:38:43.183964896 +0000 UTC m=+1244.371460765" watchObservedRunningTime="2025-10-02 18:38:43.187174487 +0000 UTC m=+1244.374670346" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.287109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8clc\" (UniqueName: \"kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc\") pod \"mysqld-exporter-4f3e-account-create-gdbfn\" 
(UID: \"8ff1a7f2-7ef1-401d-9597-52682290af50\") " pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.388362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8clc\" (UniqueName: \"kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc\") pod \"mysqld-exporter-4f3e-account-create-gdbfn\" (UID: \"8ff1a7f2-7ef1-401d-9597-52682290af50\") " pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.415934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8clc\" (UniqueName: \"kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc\") pod \"mysqld-exporter-4f3e-account-create-gdbfn\" (UID: \"8ff1a7f2-7ef1-401d-9597-52682290af50\") " pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.463180 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.530530 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.605385 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.605839 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="dnsmasq-dns" containerID="cri-o://716e80106b4a3c6ae0c559b68492ff4badbab0591931bc77196fc713177cf1e0" gracePeriod=10 Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.607067 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.693833 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgjw2\" (UniqueName: \"kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2\") pod \"11e0572e-0def-4e7e-958d-8272391b538d\" (UID: \"11e0572e-0def-4e7e-958d-8272391b538d\") " Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.714353 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2" (OuterVolumeSpecName: "kube-api-access-xgjw2") pod "11e0572e-0def-4e7e-958d-8272391b538d" (UID: "11e0572e-0def-4e7e-958d-8272391b538d"). InnerVolumeSpecName "kube-api-access-xgjw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.796294 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgjw2\" (UniqueName: \"kubernetes.io/projected/11e0572e-0def-4e7e-958d-8272391b538d-kube-api-access-xgjw2\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:43 crc kubenswrapper[4909]: I1002 18:38:43.960447 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-4f3e-account-create-gdbfn"] Oct 02 18:38:43 crc kubenswrapper[4909]: W1002 18:38:43.970190 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ff1a7f2_7ef1_401d_9597_52682290af50.slice/crio-00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a WatchSource:0}: Error finding container 00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a: Status 404 returned error can't find the container with id 00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.157613 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" event={"ID":"8ff1a7f2-7ef1-401d-9597-52682290af50","Type":"ContainerStarted","Data":"00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a"} Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.159606 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerID="716e80106b4a3c6ae0c559b68492ff4badbab0591931bc77196fc713177cf1e0" exitCode=0 Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.159637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" event={"ID":"ef353dfc-8f35-483c-ad2a-2df34420381f","Type":"ContainerDied","Data":"716e80106b4a3c6ae0c559b68492ff4badbab0591931bc77196fc713177cf1e0"} Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.159686 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" event={"ID":"ef353dfc-8f35-483c-ad2a-2df34420381f","Type":"ContainerDied","Data":"37081b51f2c4fdbde2e2eb7dc6e53a96f742c18d55e05c8034871041e6b1239e"} Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.159698 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37081b51f2c4fdbde2e2eb7dc6e53a96f742c18d55e05c8034871041e6b1239e" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.161357 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1d38c79-ac75-40be-b248-7c1438bc1c1f" containerID="8f7ad0bec3f246f980174fcd5828e9eb8614997292f5de07bfa1aa0b5c83b23e" exitCode=0 Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.161396 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf-config-764tv" event={"ID":"e1d38c79-ac75-40be-b248-7c1438bc1c1f","Type":"ContainerDied","Data":"8f7ad0bec3f246f980174fcd5828e9eb8614997292f5de07bfa1aa0b5c83b23e"} Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.163513 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-db-create-kc8f6" event={"ID":"11e0572e-0def-4e7e-958d-8272391b538d","Type":"ContainerDied","Data":"358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432"} Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.163534 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="358f0525e29e9c9771a5e17abbb8f9777c229887763924782532356ab326f432" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.163583 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kc8f6" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.170084 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.302981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppr6\" (UniqueName: \"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6\") pod \"ef353dfc-8f35-483c-ad2a-2df34420381f\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.303064 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config\") pod \"ef353dfc-8f35-483c-ad2a-2df34420381f\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.303103 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc\") pod \"ef353dfc-8f35-483c-ad2a-2df34420381f\" (UID: \"ef353dfc-8f35-483c-ad2a-2df34420381f\") " Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.309319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6" (OuterVolumeSpecName: "kube-api-access-2ppr6") pod "ef353dfc-8f35-483c-ad2a-2df34420381f" (UID: "ef353dfc-8f35-483c-ad2a-2df34420381f"). InnerVolumeSpecName "kube-api-access-2ppr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.355196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef353dfc-8f35-483c-ad2a-2df34420381f" (UID: "ef353dfc-8f35-483c-ad2a-2df34420381f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.363419 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config" (OuterVolumeSpecName: "config") pod "ef353dfc-8f35-483c-ad2a-2df34420381f" (UID: "ef353dfc-8f35-483c-ad2a-2df34420381f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.405462 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppr6\" (UniqueName: \"kubernetes.io/projected/ef353dfc-8f35-483c-ad2a-2df34420381f-kube-api-access-2ppr6\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.405513 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:44 crc kubenswrapper[4909]: I1002 18:38:44.405533 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef353dfc-8f35-483c-ad2a-2df34420381f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.182019 4909 generic.go:334] "Generic (PLEG): container finished" podID="0e9cab60-e149-4118-a3ca-4423620830bf" containerID="d5008b9b81b04d9bc5917c2660db3114e376fdb6b8799f6c49f8e5bb8ed8730b" exitCode=0 Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.182215 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4kkkt" event={"ID":"0e9cab60-e149-4118-a3ca-4423620830bf","Type":"ContainerDied","Data":"d5008b9b81b04d9bc5917c2660db3114e376fdb6b8799f6c49f8e5bb8ed8730b"} Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.187875 4909 generic.go:334] "Generic (PLEG): container finished" podID="8ff1a7f2-7ef1-401d-9597-52682290af50" containerID="0604c2babd8d145784807c52844338898fff93fc3dd5c7547f85a4641156a251" exitCode=0 Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.188068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" event={"ID":"8ff1a7f2-7ef1-401d-9597-52682290af50","Type":"ContainerDied","Data":"0604c2babd8d145784807c52844338898fff93fc3dd5c7547f85a4641156a251"} Oct 02 
18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.188207 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jz7g2" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.268902 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.282023 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jz7g2"] Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.619385 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" path="/var/lib/kubelet/pods/ef353dfc-8f35-483c-ad2a-2df34420381f/volumes" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.673985 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732514 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732644 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: 
\"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732752 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6m24\" (UniqueName: \"kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.732905 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts\") pod \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\" (UID: \"e1d38c79-ac75-40be-b248-7c1438bc1c1f\") " Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.733007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run" (OuterVolumeSpecName: "var-run") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.733116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.733184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.733819 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.734872 4909 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.734894 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.734906 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.734916 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e1d38c79-ac75-40be-b248-7c1438bc1c1f-var-log-ovn\") on node \"crc\" DevicePath \"\"" 
Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.734895 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts" (OuterVolumeSpecName: "scripts") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.737628 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24" (OuterVolumeSpecName: "kube-api-access-b6m24") pod "e1d38c79-ac75-40be-b248-7c1438bc1c1f" (UID: "e1d38c79-ac75-40be-b248-7c1438bc1c1f"). InnerVolumeSpecName "kube-api-access-b6m24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.837229 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1d38c79-ac75-40be-b248-7c1438bc1c1f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:45 crc kubenswrapper[4909]: I1002 18:38:45.837281 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6m24\" (UniqueName: \"kubernetes.io/projected/e1d38c79-ac75-40be-b248-7c1438bc1c1f-kube-api-access-b6m24\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.205362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sj5pf-config-764tv" event={"ID":"e1d38c79-ac75-40be-b248-7c1438bc1c1f","Type":"ContainerDied","Data":"cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77"} Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.205417 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb55ead5032bebf8d693afac9b9061fbe8905e274ed05036fb724b6d255b3f77" Oct 02 18:38:46 crc kubenswrapper[4909]: 
I1002 18:38:46.205493 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sj5pf-config-764tv" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.211046 4909 generic.go:334] "Generic (PLEG): container finished" podID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerID="ba73d188de58df3dffa8c592fb175de9e46bf125cd7abfcd3881fb87ceadff99" exitCode=0 Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.211246 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerDied","Data":"ba73d188de58df3dffa8c592fb175de9e46bf125cd7abfcd3881fb87ceadff99"} Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.592752 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sj5pf" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.656642 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.727579 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.731476 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.750889 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.750985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.751081 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.751120 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbt5v\" (UniqueName: \"kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.751153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.751202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.751312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices\") pod \"0e9cab60-e149-4118-a3ca-4423620830bf\" (UID: \"0e9cab60-e149-4118-a3ca-4423620830bf\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.753269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.754333 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.764348 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v" (OuterVolumeSpecName: "kube-api-access-nbt5v") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "kube-api-access-nbt5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.790529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.793639 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.811366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.819046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts" (OuterVolumeSpecName: "scripts") pod "0e9cab60-e149-4118-a3ca-4423620830bf" (UID: "0e9cab60-e149-4118-a3ca-4423620830bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.821238 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sj5pf-config-764tv"] Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.829758 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sj5pf-config-764tv"] Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.853384 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8clc\" (UniqueName: \"kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc\") pod \"8ff1a7f2-7ef1-401d-9597-52682290af50\" (UID: \"8ff1a7f2-7ef1-401d-9597-52682290af50\") " Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854091 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854173 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854230 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbt5v\" (UniqueName: \"kubernetes.io/projected/0e9cab60-e149-4118-a3ca-4423620830bf-kube-api-access-nbt5v\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854281 4909 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854338 4909 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0e9cab60-e149-4118-a3ca-4423620830bf-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854405 4909 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e9cab60-e149-4118-a3ca-4423620830bf-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.854458 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e9cab60-e149-4118-a3ca-4423620830bf-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.856988 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc" (OuterVolumeSpecName: "kube-api-access-w8clc") pod "8ff1a7f2-7ef1-401d-9597-52682290af50" (UID: "8ff1a7f2-7ef1-401d-9597-52682290af50"). InnerVolumeSpecName "kube-api-access-w8clc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:46 crc kubenswrapper[4909]: I1002 18:38:46.956719 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8clc\" (UniqueName: \"kubernetes.io/projected/8ff1a7f2-7ef1-401d-9597-52682290af50-kube-api-access-w8clc\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.220410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" event={"ID":"8ff1a7f2-7ef1-401d-9597-52682290af50","Type":"ContainerDied","Data":"00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a"} Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.220741 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00e723e1685750eb484704007a36fc828637d3fd4df33739c51bb85b3bf9329a" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.220420 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-4f3e-account-create-gdbfn" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.222266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerStarted","Data":"15b88cbc58eb05c48efebf39a96fe5676c9884e5c277e9588c948540649bfcf2"} Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.222607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.223537 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4kkkt" event={"ID":"0e9cab60-e149-4118-a3ca-4423620830bf","Type":"ContainerDied","Data":"35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e"} Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.223566 4909 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="35c586c3115c6ed3eef598f82de3f0f7202929df7f96a748e07a75bf94fe1e9e" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.223687 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4kkkt" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.278870 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371965.575926 podStartE2EDuration="1m11.278849788s" podCreationTimestamp="2025-10-02 18:37:36 +0000 UTC" firstStartedPulling="2025-10-02 18:37:38.565783493 +0000 UTC m=+1179.753279342" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:38:47.273145419 +0000 UTC m=+1248.460641298" watchObservedRunningTime="2025-10-02 18:38:47.278849788 +0000 UTC m=+1248.466345647" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.559307 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:38:47 crc kubenswrapper[4909]: I1002 18:38:47.639951 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d38c79-ac75-40be-b248-7c1438bc1c1f" path="/var/lib/kubelet/pods/e1d38c79-ac75-40be-b248-7c1438bc1c1f/volumes" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.428265 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-v55k7"] Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.428992 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d38c79-ac75-40be-b248-7c1438bc1c1f" containerName="ovn-config" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429008 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d38c79-ac75-40be-b248-7c1438bc1c1f" containerName="ovn-config" Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.429038 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="init" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429047 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="init" Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.429071 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="dnsmasq-dns" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429079 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="dnsmasq-dns" Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.429095 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e0572e-0def-4e7e-958d-8272391b538d" containerName="mariadb-database-create" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429102 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e0572e-0def-4e7e-958d-8272391b538d" containerName="mariadb-database-create" Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.429124 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff1a7f2-7ef1-401d-9597-52682290af50" containerName="mariadb-account-create" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429132 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff1a7f2-7ef1-401d-9597-52682290af50" containerName="mariadb-account-create" Oct 02 18:38:48 crc kubenswrapper[4909]: E1002 18:38:48.429148 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9cab60-e149-4118-a3ca-4423620830bf" containerName="swift-ring-rebalance" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429158 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9cab60-e149-4118-a3ca-4423620830bf" containerName="swift-ring-rebalance" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429371 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef353dfc-8f35-483c-ad2a-2df34420381f" containerName="dnsmasq-dns" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429383 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff1a7f2-7ef1-401d-9597-52682290af50" containerName="mariadb-account-create" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429400 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d38c79-ac75-40be-b248-7c1438bc1c1f" containerName="ovn-config" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429417 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9cab60-e149-4118-a3ca-4423620830bf" containerName="swift-ring-rebalance" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.429428 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e0572e-0def-4e7e-958d-8272391b538d" containerName="mariadb-database-create" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.430178 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.439947 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-v55k7"] Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.490244 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnt2\" (UniqueName: \"kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2\") pod \"mysqld-exporter-openstack-cell1-db-create-v55k7\" (UID: \"7ed2698b-26a9-4151-b915-ca44218b56b0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.592126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnt2\" (UniqueName: \"kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2\") pod \"mysqld-exporter-openstack-cell1-db-create-v55k7\" (UID: \"7ed2698b-26a9-4151-b915-ca44218b56b0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.614800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnt2\" (UniqueName: \"kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2\") pod \"mysqld-exporter-openstack-cell1-db-create-v55k7\" (UID: \"7ed2698b-26a9-4151-b915-ca44218b56b0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:48 crc kubenswrapper[4909]: I1002 18:38:48.745676 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:49 crc kubenswrapper[4909]: I1002 18:38:49.236873 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-v55k7"] Oct 02 18:38:49 crc kubenswrapper[4909]: W1002 18:38:49.251915 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ed2698b_26a9_4151_b915_ca44218b56b0.slice/crio-837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8 WatchSource:0}: Error finding container 837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8: Status 404 returned error can't find the container with id 837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8 Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.257766 4909 generic.go:334] "Generic (PLEG): container finished" podID="7ed2698b-26a9-4151-b915-ca44218b56b0" containerID="7176712659b2dbe104d4276a94c5f15329da91bdd2a81e5c0998c73fbac5a4fb" exitCode=0 Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.258009 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" event={"ID":"7ed2698b-26a9-4151-b915-ca44218b56b0","Type":"ContainerDied","Data":"7176712659b2dbe104d4276a94c5f15329da91bdd2a81e5c0998c73fbac5a4fb"} Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.258061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" event={"ID":"7ed2698b-26a9-4151-b915-ca44218b56b0","Type":"ContainerStarted","Data":"837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8"} Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.608521 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-adab-account-create-gtlhs"] Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.610235 4909 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.612221 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.620601 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-adab-account-create-gtlhs"] Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.640673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snk7r\" (UniqueName: \"kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r\") pod \"keystone-adab-account-create-gtlhs\" (UID: \"bd5aea08-beac-4fe9-adcf-7db40c37bbba\") " pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.743050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snk7r\" (UniqueName: \"kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r\") pod \"keystone-adab-account-create-gtlhs\" (UID: \"bd5aea08-beac-4fe9-adcf-7db40c37bbba\") " pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.763054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snk7r\" (UniqueName: \"kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r\") pod \"keystone-adab-account-create-gtlhs\" (UID: \"bd5aea08-beac-4fe9-adcf-7db40c37bbba\") " pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.941108 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.964469 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0028-account-create-72r8j"] Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.965861 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.976016 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0028-account-create-72r8j"] Oct 02 18:38:50 crc kubenswrapper[4909]: I1002 18:38:50.976293 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.049234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hts\" (UniqueName: \"kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts\") pod \"placement-0028-account-create-72r8j\" (UID: \"d8724c33-84bd-4420-a55c-b55ae1b7484a\") " pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.151701 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hts\" (UniqueName: \"kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts\") pod \"placement-0028-account-create-72r8j\" (UID: \"d8724c33-84bd-4420-a55c-b55ae1b7484a\") " pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.178003 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hts\" (UniqueName: \"kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts\") pod \"placement-0028-account-create-72r8j\" (UID: \"d8724c33-84bd-4420-a55c-b55ae1b7484a\") " 
pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.351597 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.474278 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-adab-account-create-gtlhs"] Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.638606 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.661049 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnt2\" (UniqueName: \"kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2\") pod \"7ed2698b-26a9-4151-b915-ca44218b56b0\" (UID: \"7ed2698b-26a9-4151-b915-ca44218b56b0\") " Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.667622 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2" (OuterVolumeSpecName: "kube-api-access-ttnt2") pod "7ed2698b-26a9-4151-b915-ca44218b56b0" (UID: "7ed2698b-26a9-4151-b915-ca44218b56b0"). InnerVolumeSpecName "kube-api-access-ttnt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.762628 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnt2\" (UniqueName: \"kubernetes.io/projected/7ed2698b-26a9-4151-b915-ca44218b56b0-kube-api-access-ttnt2\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:51 crc kubenswrapper[4909]: I1002 18:38:51.896291 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0028-account-create-72r8j"] Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.275425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" event={"ID":"7ed2698b-26a9-4151-b915-ca44218b56b0","Type":"ContainerDied","Data":"837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8"} Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.275463 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="837c0cfd48dad3953d2f8f2fb8137fc7606f37699c71a92f59c95c2f2b1247d8" Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.275496 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-v55k7" Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.276675 4909 generic.go:334] "Generic (PLEG): container finished" podID="bd5aea08-beac-4fe9-adcf-7db40c37bbba" containerID="d9565192d447669b3876c887730b5767a1d3e7af2bab5bab1cc7be7e2cb5b27f" exitCode=0 Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.276756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-adab-account-create-gtlhs" event={"ID":"bd5aea08-beac-4fe9-adcf-7db40c37bbba","Type":"ContainerDied","Data":"d9565192d447669b3876c887730b5767a1d3e7af2bab5bab1cc7be7e2cb5b27f"} Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.276794 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-adab-account-create-gtlhs" event={"ID":"bd5aea08-beac-4fe9-adcf-7db40c37bbba","Type":"ContainerStarted","Data":"72fbbbbf4fd7a34da7a890b7303a33f6b257243fa73fda47b4cd56b6dcbe8e3c"} Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.278449 4909 generic.go:334] "Generic (PLEG): container finished" podID="d8724c33-84bd-4420-a55c-b55ae1b7484a" containerID="495a8408b458b80df6bc9eeb217efd129e4ab1e7f081a946ed717dce3570afe6" exitCode=0 Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.278498 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0028-account-create-72r8j" event={"ID":"d8724c33-84bd-4420-a55c-b55ae1b7484a","Type":"ContainerDied","Data":"495a8408b458b80df6bc9eeb217efd129e4ab1e7f081a946ed717dce3570afe6"} Oct 02 18:38:52 crc kubenswrapper[4909]: I1002 18:38:52.278527 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0028-account-create-72r8j" event={"ID":"d8724c33-84bd-4420-a55c-b55ae1b7484a","Type":"ContainerStarted","Data":"ba239d100eecd65cf1b5c8113683463d93f3dbc3cfc652179d2809b9eed26180"} Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.863579 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.874673 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.904642 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snk7r\" (UniqueName: \"kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r\") pod \"bd5aea08-beac-4fe9-adcf-7db40c37bbba\" (UID: \"bd5aea08-beac-4fe9-adcf-7db40c37bbba\") " Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.904913 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hts\" (UniqueName: \"kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts\") pod \"d8724c33-84bd-4420-a55c-b55ae1b7484a\" (UID: \"d8724c33-84bd-4420-a55c-b55ae1b7484a\") " Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.915695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r" (OuterVolumeSpecName: "kube-api-access-snk7r") pod "bd5aea08-beac-4fe9-adcf-7db40c37bbba" (UID: "bd5aea08-beac-4fe9-adcf-7db40c37bbba"). InnerVolumeSpecName "kube-api-access-snk7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:53 crc kubenswrapper[4909]: I1002 18:38:53.915812 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts" (OuterVolumeSpecName: "kube-api-access-b6hts") pod "d8724c33-84bd-4420-a55c-b55ae1b7484a" (UID: "d8724c33-84bd-4420-a55c-b55ae1b7484a"). InnerVolumeSpecName "kube-api-access-b6hts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.006787 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6hts\" (UniqueName: \"kubernetes.io/projected/d8724c33-84bd-4420-a55c-b55ae1b7484a-kube-api-access-b6hts\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.006825 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snk7r\" (UniqueName: \"kubernetes.io/projected/bd5aea08-beac-4fe9-adcf-7db40c37bbba-kube-api-access-snk7r\") on node \"crc\" DevicePath \"\"" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.309574 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0028-account-create-72r8j" event={"ID":"d8724c33-84bd-4420-a55c-b55ae1b7484a","Type":"ContainerDied","Data":"ba239d100eecd65cf1b5c8113683463d93f3dbc3cfc652179d2809b9eed26180"} Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.309653 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba239d100eecd65cf1b5c8113683463d93f3dbc3cfc652179d2809b9eed26180" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.309606 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0028-account-create-72r8j" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.312577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-adab-account-create-gtlhs" event={"ID":"bd5aea08-beac-4fe9-adcf-7db40c37bbba","Type":"ContainerDied","Data":"72fbbbbf4fd7a34da7a890b7303a33f6b257243fa73fda47b4cd56b6dcbe8e3c"} Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.312650 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-adab-account-create-gtlhs" Oct 02 18:38:54 crc kubenswrapper[4909]: I1002 18:38:54.312675 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72fbbbbf4fd7a34da7a890b7303a33f6b257243fa73fda47b4cd56b6dcbe8e3c" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.288008 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e2fa-account-create-7lx7n"] Oct 02 18:38:56 crc kubenswrapper[4909]: E1002 18:38:56.289098 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5aea08-beac-4fe9-adcf-7db40c37bbba" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289126 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5aea08-beac-4fe9-adcf-7db40c37bbba" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: E1002 18:38:56.289171 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8724c33-84bd-4420-a55c-b55ae1b7484a" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289187 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8724c33-84bd-4420-a55c-b55ae1b7484a" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: E1002 18:38:56.289242 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed2698b-26a9-4151-b915-ca44218b56b0" containerName="mariadb-database-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289256 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed2698b-26a9-4151-b915-ca44218b56b0" containerName="mariadb-database-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289657 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed2698b-26a9-4151-b915-ca44218b56b0" containerName="mariadb-database-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289723 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d8724c33-84bd-4420-a55c-b55ae1b7484a" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.289765 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5aea08-beac-4fe9-adcf-7db40c37bbba" containerName="mariadb-account-create" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.290978 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.293699 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.303098 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e2fa-account-create-7lx7n"] Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.351932 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.352039 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98st\" (UniqueName: \"kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st\") pod \"glance-e2fa-account-create-7lx7n\" (UID: \"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb\") " pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.368539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc25944b-75d4-4e4e-b1de-57794b5c4bcf-etc-swift\") pod \"swift-storage-0\" (UID: \"bc25944b-75d4-4e4e-b1de-57794b5c4bcf\") " pod="openstack/swift-storage-0" Oct 02 18:38:56 crc 
kubenswrapper[4909]: I1002 18:38:56.454690 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z98st\" (UniqueName: \"kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st\") pod \"glance-e2fa-account-create-7lx7n\" (UID: \"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb\") " pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.475440 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98st\" (UniqueName: \"kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st\") pod \"glance-e2fa-account-create-7lx7n\" (UID: \"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb\") " pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.497351 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 18:38:56 crc kubenswrapper[4909]: I1002 18:38:56.643904 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:38:57 crc kubenswrapper[4909]: I1002 18:38:57.148537 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e2fa-account-create-7lx7n"] Oct 02 18:38:57 crc kubenswrapper[4909]: I1002 18:38:57.157690 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 18:38:57 crc kubenswrapper[4909]: I1002 18:38:57.350169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"34ecd43895a09865fc0392ff90b58c9a057ef619e020cdcfbb44003128d320ca"} Oct 02 18:38:57 crc kubenswrapper[4909]: I1002 18:38:57.352758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e2fa-account-create-7lx7n" event={"ID":"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb","Type":"ContainerStarted","Data":"1ba20f82eab5447952b95aa2fcd77639e270058897dcef60fa1611997dc9feab"} Oct 02 18:38:57 crc kubenswrapper[4909]: I1002 18:38:57.875316 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.293260 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-l9hfg"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.294714 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-l9hfg" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.314664 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-l9hfg"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.366538 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-krg67"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.367646 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-krg67" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.374965 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-krg67"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.376171 4909 generic.go:334] "Generic (PLEG): container finished" podID="6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" containerID="95bfc97d48c42ed508903f2ed8edb44398bec17e6f67a09adbd85c2c940b2b7b" exitCode=0 Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.376273 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e2fa-account-create-7lx7n" event={"ID":"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb","Type":"ContainerDied","Data":"95bfc97d48c42ed508903f2ed8edb44398bec17e6f67a09adbd85c2c940b2b7b"} Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.387271 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerStarted","Data":"1250894f674860f8866f1fd876542b212c0afd7890d8be24e7b495bea10a377c"} Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.401320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxb4t\" (UniqueName: \"kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t\") pod \"cinder-db-create-krg67\" (UID: \"0965a26d-8f10-4685-89be-50e446307d30\") " pod="openstack/cinder-db-create-krg67" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.401401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxzf4\" (UniqueName: \"kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4\") pod \"heat-db-create-l9hfg\" (UID: \"3a70c14f-d78d-4190-b731-68d3b606796d\") " pod="openstack/heat-db-create-l9hfg" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.463674 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ftndn"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.464747 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ftndn" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.473561 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ftndn"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.506549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r78j\" (UniqueName: \"kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j\") pod \"barbican-db-create-ftndn\" (UID: \"2c112c9e-1a94-4ff4-8b92-c712a835851c\") " pod="openstack/barbican-db-create-ftndn" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.506662 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxb4t\" (UniqueName: \"kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t\") pod \"cinder-db-create-krg67\" (UID: \"0965a26d-8f10-4685-89be-50e446307d30\") " pod="openstack/cinder-db-create-krg67" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.506715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxzf4\" (UniqueName: \"kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4\") pod \"heat-db-create-l9hfg\" (UID: \"3a70c14f-d78d-4190-b731-68d3b606796d\") " pod="openstack/heat-db-create-l9hfg" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.522843 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cb4f8"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.523908 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.527145 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zsxbj" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.527212 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.527252 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.529204 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.531331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxb4t\" (UniqueName: \"kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t\") pod \"cinder-db-create-krg67\" (UID: \"0965a26d-8f10-4685-89be-50e446307d30\") " pod="openstack/cinder-db-create-krg67" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.541980 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxzf4\" (UniqueName: \"kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4\") pod \"heat-db-create-l9hfg\" (UID: \"3a70c14f-d78d-4190-b731-68d3b606796d\") " pod="openstack/heat-db-create-l9hfg" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.555313 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cb4f8"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.608456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcdd\" (UniqueName: \"kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd\") pod \"keystone-db-sync-cb4f8\" (UID: 
\"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.608664 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r78j\" (UniqueName: \"kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j\") pod \"barbican-db-create-ftndn\" (UID: \"2c112c9e-1a94-4ff4-8b92-c712a835851c\") " pod="openstack/barbican-db-create-ftndn" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.609099 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.609380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.616342 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-l9hfg" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.623961 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r78j\" (UniqueName: \"kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j\") pod \"barbican-db-create-ftndn\" (UID: \"2c112c9e-1a94-4ff4-8b92-c712a835851c\") " pod="openstack/barbican-db-create-ftndn" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.665177 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-c778-account-create-2slnp"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.666720 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.670904 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.672856 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-z6bdb"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.674008 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z6bdb" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.685770 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c778-account-create-2slnp"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.687231 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-krg67" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.694807 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z6bdb"] Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.712760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.712817 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz8l6\" (UniqueName: \"kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6\") pod \"mysqld-exporter-c778-account-create-2slnp\" (UID: \"191d5a8e-d6cd-4538-9dde-2493ed061b25\") " pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.712850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwcdd\" (UniqueName: \"kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.712892 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.712979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b2njq\" (UniqueName: \"kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq\") pod \"neutron-db-create-z6bdb\" (UID: \"2616ad31-8b46-48ca-a3db-8e4f54596ae6\") " pod="openstack/neutron-db-create-z6bdb" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.718852 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.719832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.734922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwcdd\" (UniqueName: \"kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd\") pod \"keystone-db-sync-cb4f8\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.783345 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ftndn" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.814327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8l6\" (UniqueName: \"kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6\") pod \"mysqld-exporter-c778-account-create-2slnp\" (UID: \"191d5a8e-d6cd-4538-9dde-2493ed061b25\") " pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.814499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2njq\" (UniqueName: \"kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq\") pod \"neutron-db-create-z6bdb\" (UID: \"2616ad31-8b46-48ca-a3db-8e4f54596ae6\") " pod="openstack/neutron-db-create-z6bdb" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.836007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8l6\" (UniqueName: \"kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6\") pod \"mysqld-exporter-c778-account-create-2slnp\" (UID: \"191d5a8e-d6cd-4538-9dde-2493ed061b25\") " pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.837255 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2njq\" (UniqueName: \"kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq\") pod \"neutron-db-create-z6bdb\" (UID: \"2616ad31-8b46-48ca-a3db-8e4f54596ae6\") " pod="openstack/neutron-db-create-z6bdb" Oct 02 18:38:58 crc kubenswrapper[4909]: I1002 18:38:58.909921 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.079647 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.113065 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z6bdb" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.202247 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-l9hfg"] Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.284641 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-krg67"] Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.497240 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.497283 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.506105 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.535291 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.767266287 podStartE2EDuration="1m16.535276358s" podCreationTimestamp="2025-10-02 18:37:43 +0000 UTC" firstStartedPulling="2025-10-02 18:37:57.28622983 +0000 UTC m=+1198.473725689" lastFinishedPulling="2025-10-02 18:38:58.054239881 +0000 UTC m=+1259.241735760" observedRunningTime="2025-10-02 18:38:59.440882978 +0000 UTC m=+1260.628378847" watchObservedRunningTime="2025-10-02 18:38:59.535276358 +0000 UTC m=+1260.722772207" Oct 02 18:38:59 crc kubenswrapper[4909]: I1002 18:38:59.937671 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.057409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z98st\" (UniqueName: \"kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st\") pod \"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb\" (UID: \"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb\") " Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.066568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st" (OuterVolumeSpecName: "kube-api-access-z98st") pod "6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" (UID: "6f9dcfa1-b647-423c-9a3c-0a5f19d133cb"). InnerVolumeSpecName "kube-api-access-z98st". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.159894 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z98st\" (UniqueName: \"kubernetes.io/projected/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb-kube-api-access-z98st\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.425977 4909 generic.go:334] "Generic (PLEG): container finished" podID="0965a26d-8f10-4685-89be-50e446307d30" containerID="8a8e6350ee663fd86448714a6ffb70f643799b988372537e36489ec6a987c4ea" exitCode=0 Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.426071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-krg67" event={"ID":"0965a26d-8f10-4685-89be-50e446307d30","Type":"ContainerDied","Data":"8a8e6350ee663fd86448714a6ffb70f643799b988372537e36489ec6a987c4ea"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.426105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-krg67" 
event={"ID":"0965a26d-8f10-4685-89be-50e446307d30","Type":"ContainerStarted","Data":"c13c7033a9cd7abe9a3db6689f88dec70aef5be3205faac5f42e7db852790fe0"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.431675 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e2fa-account-create-7lx7n" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.432161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e2fa-account-create-7lx7n" event={"ID":"6f9dcfa1-b647-423c-9a3c-0a5f19d133cb","Type":"ContainerDied","Data":"1ba20f82eab5447952b95aa2fcd77639e270058897dcef60fa1611997dc9feab"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.432193 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba20f82eab5447952b95aa2fcd77639e270058897dcef60fa1611997dc9feab" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.442681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"513c0ced204c4b36e39f75c0e710a031e0df90c53e58b2ed28d1807e4fbdd12b"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.442739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"a412911195607159966a9b4edac5484c028a29e8e047f939c164454e352b1761"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.448612 4909 generic.go:334] "Generic (PLEG): container finished" podID="3a70c14f-d78d-4190-b731-68d3b606796d" containerID="3ed746a889a3acf435b0fc46a4e71b8bb2db04afd25d11a2c36fff1bd4270970" exitCode=0 Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.449338 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-l9hfg" 
event={"ID":"3a70c14f-d78d-4190-b731-68d3b606796d","Type":"ContainerDied","Data":"3ed746a889a3acf435b0fc46a4e71b8bb2db04afd25d11a2c36fff1bd4270970"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.449431 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-l9hfg" event={"ID":"3a70c14f-d78d-4190-b731-68d3b606796d","Type":"ContainerStarted","Data":"50004e14387c9aeedb9c46337b9cf56e48f53cf883260778378ca82f18afbd30"} Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.457706 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.548429 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c778-account-create-2slnp"] Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.580769 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cb4f8"] Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.603805 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z6bdb"] Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.615516 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Oct 02 18:39:00 crc kubenswrapper[4909]: I1002 18:39:00.616866 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ftndn"] Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.460519 4909 generic.go:334] "Generic (PLEG): container finished" podID="191d5a8e-d6cd-4538-9dde-2493ed061b25" containerID="ffad82f390602ea39906e575b6f7751d3cdf83e940f8dc88b2189457bb9f295f" exitCode=0 Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.460737 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c778-account-create-2slnp" 
event={"ID":"191d5a8e-d6cd-4538-9dde-2493ed061b25","Type":"ContainerDied","Data":"ffad82f390602ea39906e575b6f7751d3cdf83e940f8dc88b2189457bb9f295f"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.461059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c778-account-create-2slnp" event={"ID":"191d5a8e-d6cd-4538-9dde-2493ed061b25","Type":"ContainerStarted","Data":"9de8b3b4095cf6ceb5b7fd2b9048b03fbd8460753a7c88f2b9b70141663a6844"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.463982 4909 generic.go:334] "Generic (PLEG): container finished" podID="2616ad31-8b46-48ca-a3db-8e4f54596ae6" containerID="cd8cdd4c042b48dde2686b51942acd02e64bacdd9a2e8fbb69cd0e49d9728949" exitCode=0 Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.464150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z6bdb" event={"ID":"2616ad31-8b46-48ca-a3db-8e4f54596ae6","Type":"ContainerDied","Data":"cd8cdd4c042b48dde2686b51942acd02e64bacdd9a2e8fbb69cd0e49d9728949"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.464395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z6bdb" event={"ID":"2616ad31-8b46-48ca-a3db-8e4f54596ae6","Type":"ContainerStarted","Data":"ede5b21af8c51cf5cbd166d7f782236ee1d2972efccfa3d0896707abccd649ac"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.476336 4909 generic.go:334] "Generic (PLEG): container finished" podID="2c112c9e-1a94-4ff4-8b92-c712a835851c" containerID="9cbe2e21eeaf63627a116a9e79a150cf7f47ce05a41196f3e8b5dad8d39d30f6" exitCode=0 Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.476461 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftndn" event={"ID":"2c112c9e-1a94-4ff4-8b92-c712a835851c","Type":"ContainerDied","Data":"9cbe2e21eeaf63627a116a9e79a150cf7f47ce05a41196f3e8b5dad8d39d30f6"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.476511 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftndn" event={"ID":"2c112c9e-1a94-4ff4-8b92-c712a835851c","Type":"ContainerStarted","Data":"1b3a90b74b83ff5b874e9101f2697caaa76069a3a90314e75bbc04f5cf4ff754"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.488534 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"e4a828df25f8117621360e3af584980ebfd635db58c9c52491ca073445c43e84"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.488578 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"ebe1ff704e9ad3bdd89785215f944beeca060498aafb4ea89b2ac0df5af81973"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.490327 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cb4f8" event={"ID":"9483cd24-6cfe-48cb-a812-8059a42cb41e","Type":"ContainerStarted","Data":"df36d58c667e0296b3cba7b8ad6ba39a193b775148f71cad6f406ece58115453"} Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.532255 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m25jj"] Oct 02 18:39:01 crc kubenswrapper[4909]: E1002 18:39:01.532781 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" containerName="mariadb-account-create" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.532805 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" containerName="mariadb-account-create" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.533063 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" containerName="mariadb-account-create" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.533833 4909 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.545226 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m25jj"] Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.547252 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54748" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.548397 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.712475 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.712956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pws8\" (UniqueName: \"kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.713084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.713160 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.814403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.814488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.814511 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.814547 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pws8\" (UniqueName: \"kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.824367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data\") pod \"glance-db-sync-m25jj\" (UID: 
\"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.835486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.843845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.853047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pws8\" (UniqueName: \"kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8\") pod \"glance-db-sync-m25jj\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:01 crc kubenswrapper[4909]: I1002 18:39:01.947136 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.044615 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-l9hfg" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.126636 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-krg67" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.222380 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxzf4\" (UniqueName: \"kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4\") pod \"3a70c14f-d78d-4190-b731-68d3b606796d\" (UID: \"3a70c14f-d78d-4190-b731-68d3b606796d\") " Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.227226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4" (OuterVolumeSpecName: "kube-api-access-nxzf4") pod "3a70c14f-d78d-4190-b731-68d3b606796d" (UID: "3a70c14f-d78d-4190-b731-68d3b606796d"). InnerVolumeSpecName "kube-api-access-nxzf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.324260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxb4t\" (UniqueName: \"kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t\") pod \"0965a26d-8f10-4685-89be-50e446307d30\" (UID: \"0965a26d-8f10-4685-89be-50e446307d30\") " Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.324768 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxzf4\" (UniqueName: \"kubernetes.io/projected/3a70c14f-d78d-4190-b731-68d3b606796d-kube-api-access-nxzf4\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.327827 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t" (OuterVolumeSpecName: "kube-api-access-kxb4t") pod "0965a26d-8f10-4685-89be-50e446307d30" (UID: "0965a26d-8f10-4685-89be-50e446307d30"). InnerVolumeSpecName "kube-api-access-kxb4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.427614 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxb4t\" (UniqueName: \"kubernetes.io/projected/0965a26d-8f10-4685-89be-50e446307d30-kube-api-access-kxb4t\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.502533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-krg67" event={"ID":"0965a26d-8f10-4685-89be-50e446307d30","Type":"ContainerDied","Data":"c13c7033a9cd7abe9a3db6689f88dec70aef5be3205faac5f42e7db852790fe0"} Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.502579 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c13c7033a9cd7abe9a3db6689f88dec70aef5be3205faac5f42e7db852790fe0" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.502585 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-krg67" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.512253 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-l9hfg" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.514105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-l9hfg" event={"ID":"3a70c14f-d78d-4190-b731-68d3b606796d","Type":"ContainerDied","Data":"50004e14387c9aeedb9c46337b9cf56e48f53cf883260778378ca82f18afbd30"} Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.514153 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50004e14387c9aeedb9c46337b9cf56e48f53cf883260778378ca82f18afbd30" Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.514172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m25jj"] Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.642336 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:02 crc kubenswrapper[4909]: I1002 18:39:02.879329 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ftndn" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.048294 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r78j\" (UniqueName: \"kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j\") pod \"2c112c9e-1a94-4ff4-8b92-c712a835851c\" (UID: \"2c112c9e-1a94-4ff4-8b92-c712a835851c\") " Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.056934 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j" (OuterVolumeSpecName: "kube-api-access-6r78j") pod "2c112c9e-1a94-4ff4-8b92-c712a835851c" (UID: "2c112c9e-1a94-4ff4-8b92-c712a835851c"). InnerVolumeSpecName "kube-api-access-6r78j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.112046 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.116731 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z6bdb" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.155096 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r78j\" (UniqueName: \"kubernetes.io/projected/2c112c9e-1a94-4ff4-8b92-c712a835851c-kube-api-access-6r78j\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.261864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz8l6\" (UniqueName: \"kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6\") pod \"191d5a8e-d6cd-4538-9dde-2493ed061b25\" (UID: \"191d5a8e-d6cd-4538-9dde-2493ed061b25\") " Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.262214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2njq\" (UniqueName: \"kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq\") pod \"2616ad31-8b46-48ca-a3db-8e4f54596ae6\" (UID: \"2616ad31-8b46-48ca-a3db-8e4f54596ae6\") " Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.266497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6" (OuterVolumeSpecName: "kube-api-access-mz8l6") pod "191d5a8e-d6cd-4538-9dde-2493ed061b25" (UID: "191d5a8e-d6cd-4538-9dde-2493ed061b25"). InnerVolumeSpecName "kube-api-access-mz8l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.271901 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq" (OuterVolumeSpecName: "kube-api-access-b2njq") pod "2616ad31-8b46-48ca-a3db-8e4f54596ae6" (UID: "2616ad31-8b46-48ca-a3db-8e4f54596ae6"). InnerVolumeSpecName "kube-api-access-b2njq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.366322 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz8l6\" (UniqueName: \"kubernetes.io/projected/191d5a8e-d6cd-4538-9dde-2493ed061b25-kube-api-access-mz8l6\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.366360 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2njq\" (UniqueName: \"kubernetes.io/projected/2616ad31-8b46-48ca-a3db-8e4f54596ae6-kube-api-access-b2njq\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.525512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m25jj" event={"ID":"2671433b-366d-48ec-90d4-28a4ed3cece9","Type":"ContainerStarted","Data":"561382fa2cb7a64742bdb5f1430f1f1245670f13ab0e59641ff8213940564a3e"} Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.528116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c778-account-create-2slnp" event={"ID":"191d5a8e-d6cd-4538-9dde-2493ed061b25","Type":"ContainerDied","Data":"9de8b3b4095cf6ceb5b7fd2b9048b03fbd8460753a7c88f2b9b70141663a6844"} Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.529357 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de8b3b4095cf6ceb5b7fd2b9048b03fbd8460753a7c88f2b9b70141663a6844" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.528435 
4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-c778-account-create-2slnp" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.541257 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z6bdb" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.541261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z6bdb" event={"ID":"2616ad31-8b46-48ca-a3db-8e4f54596ae6","Type":"ContainerDied","Data":"ede5b21af8c51cf5cbd166d7f782236ee1d2972efccfa3d0896707abccd649ac"} Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.541390 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede5b21af8c51cf5cbd166d7f782236ee1d2972efccfa3d0896707abccd649ac" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543339 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ftndn" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543333 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ftndn" event={"ID":"2c112c9e-1a94-4ff4-8b92-c712a835851c","Type":"ContainerDied","Data":"1b3a90b74b83ff5b874e9101f2697caaa76069a3a90314e75bbc04f5cf4ff754"} Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543385 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b3a90b74b83ff5b874e9101f2697caaa76069a3a90314e75bbc04f5cf4ff754" Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543639 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="config-reloader" containerID="cri-o://1dd406d8ce1e070a752a947917123f68ae98f8d1616c91ddc9f0a04673148e7f" gracePeriod=600 Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543688 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="thanos-sidecar" containerID="cri-o://46489191a0dfcafaea5edd7abbdae4ba42500ceb3e817660f8e296eb04493bf7" gracePeriod=600 Oct 02 18:39:03 crc kubenswrapper[4909]: I1002 18:39:03.543681 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" containerID="cri-o://1250894f674860f8866f1fd876542b212c0afd7890d8be24e7b495bea10a377c" gracePeriod=600 Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.497080 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.131:9090/-/ready\": dial tcp 10.217.0.131:9090: connect: connection refused" Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559253 4909 generic.go:334] "Generic (PLEG): container finished" podID="7485bfaa-555b-4469-9681-bc735a109726" containerID="1250894f674860f8866f1fd876542b212c0afd7890d8be24e7b495bea10a377c" exitCode=0 Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559283 4909 generic.go:334] "Generic (PLEG): container finished" podID="7485bfaa-555b-4469-9681-bc735a109726" containerID="46489191a0dfcafaea5edd7abbdae4ba42500ceb3e817660f8e296eb04493bf7" exitCode=0 Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559321 4909 generic.go:334] "Generic (PLEG): container finished" podID="7485bfaa-555b-4469-9681-bc735a109726" containerID="1dd406d8ce1e070a752a947917123f68ae98f8d1616c91ddc9f0a04673148e7f" exitCode=0 Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerDied","Data":"1250894f674860f8866f1fd876542b212c0afd7890d8be24e7b495bea10a377c"} Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerDied","Data":"46489191a0dfcafaea5edd7abbdae4ba42500ceb3e817660f8e296eb04493bf7"} Oct 02 18:39:04 crc kubenswrapper[4909]: I1002 18:39:04.559400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerDied","Data":"1dd406d8ce1e070a752a947917123f68ae98f8d1616c91ddc9f0a04673148e7f"} Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.444050 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-3859-account-create-wvjxj"] Oct 02 18:39:08 crc kubenswrapper[4909]: E1002 18:39:08.445172 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0965a26d-8f10-4685-89be-50e446307d30" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445190 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0965a26d-8f10-4685-89be-50e446307d30" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: E1002 18:39:08.445239 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c112c9e-1a94-4ff4-8b92-c712a835851c" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445249 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c112c9e-1a94-4ff4-8b92-c712a835851c" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: E1002 18:39:08.445265 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2616ad31-8b46-48ca-a3db-8e4f54596ae6" containerName="mariadb-database-create" Oct 02 18:39:08 crc 
kubenswrapper[4909]: I1002 18:39:08.445274 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2616ad31-8b46-48ca-a3db-8e4f54596ae6" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: E1002 18:39:08.445292 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191d5a8e-d6cd-4538-9dde-2493ed061b25" containerName="mariadb-account-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445302 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="191d5a8e-d6cd-4538-9dde-2493ed061b25" containerName="mariadb-account-create" Oct 02 18:39:08 crc kubenswrapper[4909]: E1002 18:39:08.445340 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a70c14f-d78d-4190-b731-68d3b606796d" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445349 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a70c14f-d78d-4190-b731-68d3b606796d" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445575 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a70c14f-d78d-4190-b731-68d3b606796d" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445600 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2616ad31-8b46-48ca-a3db-8e4f54596ae6" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445626 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c112c9e-1a94-4ff4-8b92-c712a835851c" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445650 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="191d5a8e-d6cd-4538-9dde-2493ed061b25" containerName="mariadb-account-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.445667 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0965a26d-8f10-4685-89be-50e446307d30" containerName="mariadb-database-create" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.446546 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.450344 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.469441 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3859-account-create-wvjxj"] Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.548505 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d8c6-account-create-582jx"] Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.549893 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.552389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.556795 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d8c6-account-create-582jx"] Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.574426 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qg6\" (UniqueName: \"kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6\") pod \"heat-3859-account-create-wvjxj\" (UID: \"191bbde0-1bc0-418f-ab35-5abe18caa0b1\") " pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.574522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56q5\" (UniqueName: \"kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5\") pod 
\"cinder-d8c6-account-create-582jx\" (UID: \"5e7fb0ec-f6be-437c-a9c6-98e50de69047\") " pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.676277 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qg6\" (UniqueName: \"kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6\") pod \"heat-3859-account-create-wvjxj\" (UID: \"191bbde0-1bc0-418f-ab35-5abe18caa0b1\") " pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.676460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56q5\" (UniqueName: \"kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5\") pod \"cinder-d8c6-account-create-582jx\" (UID: \"5e7fb0ec-f6be-437c-a9c6-98e50de69047\") " pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.704470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56q5\" (UniqueName: \"kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5\") pod \"cinder-d8c6-account-create-582jx\" (UID: \"5e7fb0ec-f6be-437c-a9c6-98e50de69047\") " pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.706212 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qg6\" (UniqueName: \"kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6\") pod \"heat-3859-account-create-wvjxj\" (UID: \"191bbde0-1bc0-418f-ab35-5abe18caa0b1\") " pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.778180 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.873900 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.875691 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.878140 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.879779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4kdk\" (UniqueName: \"kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.879859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.879880 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.882631 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.894492 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.981591 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4kdk\" (UniqueName: \"kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.981942 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.981966 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.986311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.986395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: 
\"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:08 crc kubenswrapper[4909]: I1002 18:39:08.999852 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4kdk\" (UniqueName: \"kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk\") pod \"mysqld-exporter-0\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " pod="openstack/mysqld-exporter-0" Oct 02 18:39:09 crc kubenswrapper[4909]: I1002 18:39:09.225395 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:39:09 crc kubenswrapper[4909]: I1002 18:39:09.497473 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.131:9090/-/ready\": dial tcp 10.217.0.131:9090: connect: connection refused" Oct 02 18:39:14 crc kubenswrapper[4909]: I1002 18:39:14.496924 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.131:9090/-/ready\": dial tcp 10.217.0.131:9090: connect: connection refused" Oct 02 18:39:14 crc kubenswrapper[4909]: I1002 18:39:14.497891 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.326562 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.460352 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.460565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.460715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.460885 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.460964 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.461096 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-dkvgm\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.461169 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.461252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0\") pod \"7485bfaa-555b-4469-9681-bc735a109726\" (UID: \"7485bfaa-555b-4469-9681-bc735a109726\") " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.462238 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.467215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm" (OuterVolumeSpecName: "kube-api-access-dkvgm") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "kube-api-access-dkvgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.472688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.475257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out" (OuterVolumeSpecName: "config-out") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.475260 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.476218 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config" (OuterVolumeSpecName: "config") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.490799 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "pvc-c46b930e-3186-4c55-b42e-a7116184228b". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.518143 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config" (OuterVolumeSpecName: "web-config") pod "7485bfaa-555b-4469-9681-bc735a109726" (UID: "7485bfaa-555b-4469-9681-bc735a109726"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.534844 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3859-account-create-wvjxj"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.546649 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d8c6-account-create-582jx"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563481 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkvgm\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-kube-api-access-dkvgm\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563513 4909 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7485bfaa-555b-4469-9681-bc735a109726-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563530 4909 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7485bfaa-555b-4469-9681-bc735a109726-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563542 4909 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-web-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563555 4909 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7485bfaa-555b-4469-9681-bc735a109726-config-out\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563566 4909 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563598 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") on node \"crc\" " Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.563608 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7485bfaa-555b-4469-9681-bc735a109726-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: W1002 18:39:17.563858 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191bbde0_1bc0_418f_ab35_5abe18caa0b1.slice/crio-5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382 WatchSource:0}: Error finding container 5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382: 
Status 404 returned error can't find the container with id 5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382 Oct 02 18:39:17 crc kubenswrapper[4909]: W1002 18:39:17.574124 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e7fb0ec_f6be_437c_a9c6_98e50de69047.slice/crio-e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7 WatchSource:0}: Error finding container e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7: Status 404 returned error can't find the container with id e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7 Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.585799 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.585960 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c46b930e-3186-4c55-b42e-a7116184228b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b") on node "crc" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.665458 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.745046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3859-account-create-wvjxj" event={"ID":"191bbde0-1bc0-418f-ab35-5abe18caa0b1","Type":"ContainerStarted","Data":"5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.748499 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d8c6-account-create-582jx" 
event={"ID":"5e7fb0ec-f6be-437c-a9c6-98e50de69047","Type":"ContainerStarted","Data":"e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.752705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7485bfaa-555b-4469-9681-bc735a109726","Type":"ContainerDied","Data":"16734674d76ebd848942e08eb4571dab2bdce400f0ea3183c973a0bfcb062009"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.752742 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.752769 4909 scope.go:117] "RemoveContainer" containerID="1250894f674860f8866f1fd876542b212c0afd7890d8be24e7b495bea10a377c" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.770943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"3b0082a272f2018c5bf7285eb63d99b0d10e929cadce701dc7a0f4d126eda458"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.770990 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"b2583417428b3581b58909512d148dee023bf6b461c03753f2c18134794d4ee8"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.771000 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"9f5f6a406f39771c4a0b201b4cb3201b5d3104d5592130209d6661c5235df5d2"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.776522 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cb4f8" 
event={"ID":"9483cd24-6cfe-48cb-a812-8059a42cb41e","Type":"ContainerStarted","Data":"873613d600e989b17372e2117a3734d79f828585e2e6ca21eec4daf69a22c2df"} Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.779176 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.782959 4909 scope.go:117] "RemoveContainer" containerID="46489191a0dfcafaea5edd7abbdae4ba42500ceb3e817660f8e296eb04493bf7" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.788237 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.801314 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.803248 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cb4f8" podStartSLOduration=3.412224166 podStartE2EDuration="19.803231498s" podCreationTimestamp="2025-10-02 18:38:58 +0000 UTC" firstStartedPulling="2025-10-02 18:39:00.601749722 +0000 UTC m=+1261.789245581" lastFinishedPulling="2025-10-02 18:39:16.992757014 +0000 UTC m=+1278.180252913" observedRunningTime="2025-10-02 18:39:17.789629443 +0000 UTC m=+1278.977125302" watchObservedRunningTime="2025-10-02 18:39:17.803231498 +0000 UTC m=+1278.990727357" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.817913 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:17 crc kubenswrapper[4909]: E1002 18:39:17.820500 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820531 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" Oct 02 18:39:17 crc 
kubenswrapper[4909]: E1002 18:39:17.820543 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="thanos-sidecar" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820555 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="thanos-sidecar" Oct 02 18:39:17 crc kubenswrapper[4909]: E1002 18:39:17.820566 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="init-config-reloader" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="init-config-reloader" Oct 02 18:39:17 crc kubenswrapper[4909]: E1002 18:39:17.820589 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="config-reloader" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820595 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="config-reloader" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820834 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="prometheus" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820867 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="thanos-sidecar" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.820884 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7485bfaa-555b-4469-9681-bc735a109726" containerName="config-reloader" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.822656 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.827265 4909 scope.go:117] "RemoveContainer" containerID="1dd406d8ce1e070a752a947917123f68ae98f8d1616c91ddc9f0a04673148e7f" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.827621 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.827651 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.827792 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.827894 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-27q5x" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.828523 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.829824 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.831322 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.837418 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.876601 4909 scope.go:117] "RemoveContainer" containerID="38d1261e38e7874de28fc5e1793400c1aa1f1875912aac6c9c8c3a3976295b43" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.982945 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg48v\" (UniqueName: \"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-kube-api-access-pg48v\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983569 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 
crc kubenswrapper[4909]: I1002 18:39:17.983667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983792 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.983817 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:17 crc kubenswrapper[4909]: I1002 18:39:17.984241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.085753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086079 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086171 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086205 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.086308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg48v\" (UniqueName: \"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-kube-api-access-pg48v\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.087839 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.092671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.093464 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.093679 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.093718 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf86cfcf601ef491894e84c5443d83ba8dd6ded986121e5a4bb1909afcd734cb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.096630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.097359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.098524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.099480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.101214 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.101731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.109618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg48v\" (UniqueName: 
\"kubernetes.io/projected/bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7-kube-api-access-pg48v\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.138551 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c46b930e-3186-4c55-b42e-a7116184228b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c46b930e-3186-4c55-b42e-a7116184228b\") pod \"prometheus-metric-storage-0\" (UID: \"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7\") " pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.177498 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.401260 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0c9a-account-create-g2zs4"] Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.403103 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.405398 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.409544 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0c9a-account-create-g2zs4"] Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.501242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247xl\" (UniqueName: \"kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl\") pod \"barbican-0c9a-account-create-g2zs4\" (UID: \"b659a4c7-1d4e-432b-b759-b446761f985c\") " pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.602732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247xl\" (UniqueName: \"kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl\") pod \"barbican-0c9a-account-create-g2zs4\" (UID: \"b659a4c7-1d4e-432b-b759-b446761f985c\") " pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.621242 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247xl\" (UniqueName: \"kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl\") pod \"barbican-0c9a-account-create-g2zs4\" (UID: \"b659a4c7-1d4e-432b-b759-b446761f985c\") " pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.686871 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1af5-account-create-jw8kh"] Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.688288 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.690566 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.701690 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1af5-account-create-jw8kh"] Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.742791 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.743352 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 18:39:18 crc kubenswrapper[4909]: W1002 18:39:18.750235 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcfde8f0_f1c0_41c4_bc7f_8c6c9f7e28a7.slice/crio-9eab85f582e7904b85df87269e3322484a72b1d01472c0e3372aa70c2a6650e2 WatchSource:0}: Error finding container 9eab85f582e7904b85df87269e3322484a72b1d01472c0e3372aa70c2a6650e2: Status 404 returned error can't find the container with id 9eab85f582e7904b85df87269e3322484a72b1d01472c0e3372aa70c2a6650e2 Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.792394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"e6e8d12ca8cc4a4fdfdae3e91a2657b12b50d468b6999ec8303a2f7fc87c00a3"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.795926 4909 generic.go:334] "Generic (PLEG): container finished" podID="191bbde0-1bc0-418f-ab35-5abe18caa0b1" containerID="98c22f569ba7eb60269756e1a95dccde274abe4eccbc2eeadfe7366ba54c9e6c" exitCode=0 Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.796128 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-3859-account-create-wvjxj" event={"ID":"191bbde0-1bc0-418f-ab35-5abe18caa0b1","Type":"ContainerDied","Data":"98c22f569ba7eb60269756e1a95dccde274abe4eccbc2eeadfe7366ba54c9e6c"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.801123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m25jj" event={"ID":"2671433b-366d-48ec-90d4-28a4ed3cece9","Type":"ContainerStarted","Data":"f4d3924233efd779b395c1134fca2e63cb7d8252538b49009c7ae427c4b6b1a1"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.803350 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerStarted","Data":"9eab85f582e7904b85df87269e3322484a72b1d01472c0e3372aa70c2a6650e2"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.805462 4909 generic.go:334] "Generic (PLEG): container finished" podID="5e7fb0ec-f6be-437c-a9c6-98e50de69047" containerID="b95876ca37579b618c5a7c91d05906697b9b6076ed23fc99d4edf4e68a661678" exitCode=0 Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.805541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d8c6-account-create-582jx" event={"ID":"5e7fb0ec-f6be-437c-a9c6-98e50de69047","Type":"ContainerDied","Data":"b95876ca37579b618c5a7c91d05906697b9b6076ed23fc99d4edf4e68a661678"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.805644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5q2\" (UniqueName: \"kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2\") pod \"neutron-1af5-account-create-jw8kh\" (UID: \"d953baa8-f150-45e6-9949-766017b28d8d\") " pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.808091 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421","Type":"ContainerStarted","Data":"6d3157829b8e95d79a5b1be6769bfaae889b3f9eba6837d11b05f4e9b6e44a80"} Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.853655 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m25jj" podStartSLOduration=3.306174772 podStartE2EDuration="17.85363827s" podCreationTimestamp="2025-10-02 18:39:01 +0000 UTC" firstStartedPulling="2025-10-02 18:39:02.540649035 +0000 UTC m=+1263.728144894" lastFinishedPulling="2025-10-02 18:39:17.088112523 +0000 UTC m=+1278.275608392" observedRunningTime="2025-10-02 18:39:18.847060414 +0000 UTC m=+1280.034556303" watchObservedRunningTime="2025-10-02 18:39:18.85363827 +0000 UTC m=+1280.041134129" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.907416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5q2\" (UniqueName: \"kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2\") pod \"neutron-1af5-account-create-jw8kh\" (UID: \"d953baa8-f150-45e6-9949-766017b28d8d\") " pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:18 crc kubenswrapper[4909]: I1002 18:39:18.930454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5q2\" (UniqueName: \"kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2\") pod \"neutron-1af5-account-create-jw8kh\" (UID: \"d953baa8-f150-45e6-9949-766017b28d8d\") " pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.007190 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.297716 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0c9a-account-create-g2zs4"] Oct 02 18:39:19 crc kubenswrapper[4909]: W1002 18:39:19.301275 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb659a4c7_1d4e_432b_b759_b446761f985c.slice/crio-e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565 WatchSource:0}: Error finding container e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565: Status 404 returned error can't find the container with id e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565 Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.461969 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1af5-account-create-jw8kh"] Oct 02 18:39:19 crc kubenswrapper[4909]: W1002 18:39:19.469681 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd953baa8_f150_45e6_9949_766017b28d8d.slice/crio-462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd WatchSource:0}: Error finding container 462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd: Status 404 returned error can't find the container with id 462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.618589 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7485bfaa-555b-4469-9681-bc735a109726" path="/var/lib/kubelet/pods/7485bfaa-555b-4469-9681-bc735a109726/volumes" Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.819617 4909 generic.go:334] "Generic (PLEG): container finished" podID="b659a4c7-1d4e-432b-b759-b446761f985c" containerID="5bef52bd5b082c0b9ddafc5ca237ad00ef1cac67457008a8fd73676187d42abe" 
exitCode=0 Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.819718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c9a-account-create-g2zs4" event={"ID":"b659a4c7-1d4e-432b-b759-b446761f985c","Type":"ContainerDied","Data":"5bef52bd5b082c0b9ddafc5ca237ad00ef1cac67457008a8fd73676187d42abe"} Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.819780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c9a-account-create-g2zs4" event={"ID":"b659a4c7-1d4e-432b-b759-b446761f985c","Type":"ContainerStarted","Data":"e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565"} Oct 02 18:39:19 crc kubenswrapper[4909]: I1002 18:39:19.821353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1af5-account-create-jw8kh" event={"ID":"d953baa8-f150-45e6-9949-766017b28d8d","Type":"ContainerStarted","Data":"462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd"} Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.776330 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.786579 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.842671 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56q5\" (UniqueName: \"kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5\") pod \"5e7fb0ec-f6be-437c-a9c6-98e50de69047\" (UID: \"5e7fb0ec-f6be-437c-a9c6-98e50de69047\") " Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.842738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5qg6\" (UniqueName: \"kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6\") pod \"191bbde0-1bc0-418f-ab35-5abe18caa0b1\" (UID: \"191bbde0-1bc0-418f-ab35-5abe18caa0b1\") " Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.843945 4909 generic.go:334] "Generic (PLEG): container finished" podID="d953baa8-f150-45e6-9949-766017b28d8d" containerID="a80ef33be5c65eafb205abcb4138156b35af41d143261d04494627702ecdb959" exitCode=0 Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.844178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1af5-account-create-jw8kh" event={"ID":"d953baa8-f150-45e6-9949-766017b28d8d","Type":"ContainerDied","Data":"a80ef33be5c65eafb205abcb4138156b35af41d143261d04494627702ecdb959"} Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.846506 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3859-account-create-wvjxj" event={"ID":"191bbde0-1bc0-418f-ab35-5abe18caa0b1","Type":"ContainerDied","Data":"5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382"} Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.846553 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d92374f28246f25120a819780b8974317c8a08bfc82b713e0eae9a26c9a4382" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.846619 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3859-account-create-wvjxj" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.848354 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d8c6-account-create-582jx" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.848440 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d8c6-account-create-582jx" event={"ID":"5e7fb0ec-f6be-437c-a9c6-98e50de69047","Type":"ContainerDied","Data":"e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7"} Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.848464 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9252ca617844a02833b56858c0e676ddf124f7429f81cea9575d87d2a8038d7" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.851483 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5" (OuterVolumeSpecName: "kube-api-access-m56q5") pod "5e7fb0ec-f6be-437c-a9c6-98e50de69047" (UID: "5e7fb0ec-f6be-437c-a9c6-98e50de69047"). InnerVolumeSpecName "kube-api-access-m56q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.853088 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6" (OuterVolumeSpecName: "kube-api-access-l5qg6") pod "191bbde0-1bc0-418f-ab35-5abe18caa0b1" (UID: "191bbde0-1bc0-418f-ab35-5abe18caa0b1"). InnerVolumeSpecName "kube-api-access-l5qg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.944809 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56q5\" (UniqueName: \"kubernetes.io/projected/5e7fb0ec-f6be-437c-a9c6-98e50de69047-kube-api-access-m56q5\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:20 crc kubenswrapper[4909]: I1002 18:39:20.945099 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5qg6\" (UniqueName: \"kubernetes.io/projected/191bbde0-1bc0-418f-ab35-5abe18caa0b1-kube-api-access-l5qg6\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:21 crc kubenswrapper[4909]: I1002 18:39:21.866473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421","Type":"ContainerStarted","Data":"7ec5e3716d9bee9579242e63236dfe08f677ba2dfc853a376e719350061ba460"} Oct 02 18:39:21 crc kubenswrapper[4909]: I1002 18:39:21.872850 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"49f43377a3d436636fe30e130d3e76c430f5ee5cdbc2b50c867b429cab23682e"} Oct 02 18:39:21 crc kubenswrapper[4909]: I1002 18:39:21.885946 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=10.133061124 podStartE2EDuration="13.885930128s" podCreationTimestamp="2025-10-02 18:39:08 +0000 UTC" firstStartedPulling="2025-10-02 18:39:17.827179056 +0000 UTC m=+1279.014674915" lastFinishedPulling="2025-10-02 18:39:21.58004803 +0000 UTC m=+1282.767543919" observedRunningTime="2025-10-02 18:39:21.884657588 +0000 UTC m=+1283.072153447" watchObservedRunningTime="2025-10-02 18:39:21.885930128 +0000 UTC m=+1283.073425997" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.462815 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.469244 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.588669 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247xl\" (UniqueName: \"kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl\") pod \"b659a4c7-1d4e-432b-b759-b446761f985c\" (UID: \"b659a4c7-1d4e-432b-b759-b446761f985c\") " Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.588836 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5q2\" (UniqueName: \"kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2\") pod \"d953baa8-f150-45e6-9949-766017b28d8d\" (UID: \"d953baa8-f150-45e6-9949-766017b28d8d\") " Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.599282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl" (OuterVolumeSpecName: "kube-api-access-247xl") pod "b659a4c7-1d4e-432b-b759-b446761f985c" (UID: "b659a4c7-1d4e-432b-b759-b446761f985c"). InnerVolumeSpecName "kube-api-access-247xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.610581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2" (OuterVolumeSpecName: "kube-api-access-xl5q2") pod "d953baa8-f150-45e6-9949-766017b28d8d" (UID: "d953baa8-f150-45e6-9949-766017b28d8d"). InnerVolumeSpecName "kube-api-access-xl5q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.692555 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl5q2\" (UniqueName: \"kubernetes.io/projected/d953baa8-f150-45e6-9949-766017b28d8d-kube-api-access-xl5q2\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.692713 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247xl\" (UniqueName: \"kubernetes.io/projected/b659a4c7-1d4e-432b-b759-b446761f985c-kube-api-access-247xl\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.891868 4909 generic.go:334] "Generic (PLEG): container finished" podID="9483cd24-6cfe-48cb-a812-8059a42cb41e" containerID="873613d600e989b17372e2117a3734d79f828585e2e6ca21eec4daf69a22c2df" exitCode=0 Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.891946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cb4f8" event={"ID":"9483cd24-6cfe-48cb-a812-8059a42cb41e","Type":"ContainerDied","Data":"873613d600e989b17372e2117a3734d79f828585e2e6ca21eec4daf69a22c2df"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.895695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c9a-account-create-g2zs4" event={"ID":"b659a4c7-1d4e-432b-b759-b446761f985c","Type":"ContainerDied","Data":"e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.895740 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5859414a0b8561d2e6dfe01d206ae7ca02ed65c0c3afaae26f7acd60bd01565" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.895750 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0c9a-account-create-g2zs4" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.905350 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1af5-account-create-jw8kh" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.905343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1af5-account-create-jw8kh" event={"ID":"d953baa8-f150-45e6-9949-766017b28d8d","Type":"ContainerDied","Data":"462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.905473 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462562811f50b266ac21c505f1415d69d1565aaa8a8de0260d383c194f0a7fbd" Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.916960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"02ccf885ca2f5b0422c63fe2f3a7e8b352006b816308da505830a864e02f6f3b"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.917019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"753ac733edfb985bcf1cce14433729f8b4fdca757da5efcab421df8522c4892d"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.917046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"fb7fbb76c539d5e68693d7deaea0b5e1fb2ad1c9d1ef0450ddc4d4714af4d67e"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.917059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"d265432c449283c9c11ecf5d19d197ba3e0b295fe8fba9a80e59fb71840ab35d"} Oct 02 18:39:22 crc kubenswrapper[4909]: I1002 18:39:22.921726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerStarted","Data":"324b86dbc015245016ef098952c910e32a031897915660f8a4b7227c47f5b444"} Oct 02 18:39:23 crc kubenswrapper[4909]: I1002 18:39:23.945958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"90f33af6d017c7c199db7850eb941193bec16710cdf8f1981ec9514b27dd7932"} Oct 02 18:39:23 crc kubenswrapper[4909]: I1002 18:39:23.946387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc25944b-75d4-4e4e-b1de-57794b5c4bcf","Type":"ContainerStarted","Data":"b142f5db1d6c84fb0134fe166b80cf0c812c303d00e8cd1ec45d62810c25d1da"} Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.018674 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.667988495 podStartE2EDuration="1m1.018648278s" podCreationTimestamp="2025-10-02 18:38:23 +0000 UTC" firstStartedPulling="2025-10-02 18:38:57.158287855 +0000 UTC m=+1258.345783724" lastFinishedPulling="2025-10-02 18:39:21.508947658 +0000 UTC m=+1282.696443507" observedRunningTime="2025-10-02 18:39:24.001407049 +0000 UTC m=+1285.188902948" watchObservedRunningTime="2025-10-02 18:39:24.018648278 +0000 UTC m=+1285.206144177" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.291688 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:24 crc kubenswrapper[4909]: E1002 18:39:24.292227 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d953baa8-f150-45e6-9949-766017b28d8d" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292246 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d953baa8-f150-45e6-9949-766017b28d8d" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: E1002 18:39:24.292258 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b659a4c7-1d4e-432b-b759-b446761f985c" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292265 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b659a4c7-1d4e-432b-b759-b446761f985c" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: E1002 18:39:24.292287 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191bbde0-1bc0-418f-ab35-5abe18caa0b1" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292293 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="191bbde0-1bc0-418f-ab35-5abe18caa0b1" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: E1002 18:39:24.292311 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7fb0ec-f6be-437c-a9c6-98e50de69047" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292317 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7fb0ec-f6be-437c-a9c6-98e50de69047" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292491 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b659a4c7-1d4e-432b-b759-b446761f985c" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292507 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d953baa8-f150-45e6-9949-766017b28d8d" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292522 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="191bbde0-1bc0-418f-ab35-5abe18caa0b1" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.292539 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e7fb0ec-f6be-437c-a9c6-98e50de69047" containerName="mariadb-account-create" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.295795 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.301061 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.317306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqsg\" (UniqueName: \"kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb\") pod 
\"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343764 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343823 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.343851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.396477 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445145 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle\") pod \"9483cd24-6cfe-48cb-a812-8059a42cb41e\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwcdd\" (UniqueName: \"kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd\") pod \"9483cd24-6cfe-48cb-a812-8059a42cb41e\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data\") pod \"9483cd24-6cfe-48cb-a812-8059a42cb41e\" (UID: \"9483cd24-6cfe-48cb-a812-8059a42cb41e\") " Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445672 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445791 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445811 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqsg\" (UniqueName: \"kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.445858 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.446968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.447072 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.450944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.450970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.453151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.469741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd" (OuterVolumeSpecName: "kube-api-access-pwcdd") pod "9483cd24-6cfe-48cb-a812-8059a42cb41e" (UID: "9483cd24-6cfe-48cb-a812-8059a42cb41e"). InnerVolumeSpecName "kube-api-access-pwcdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.473180 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqsg\" (UniqueName: \"kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg\") pod \"dnsmasq-dns-77585f5f8c-7vjx6\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.497778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9483cd24-6cfe-48cb-a812-8059a42cb41e" (UID: "9483cd24-6cfe-48cb-a812-8059a42cb41e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.516094 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data" (OuterVolumeSpecName: "config-data") pod "9483cd24-6cfe-48cb-a812-8059a42cb41e" (UID: "9483cd24-6cfe-48cb-a812-8059a42cb41e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.548157 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.548198 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwcdd\" (UniqueName: \"kubernetes.io/projected/9483cd24-6cfe-48cb-a812-8059a42cb41e-kube-api-access-pwcdd\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.548213 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9483cd24-6cfe-48cb-a812-8059a42cb41e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.696266 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.960926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cb4f8" event={"ID":"9483cd24-6cfe-48cb-a812-8059a42cb41e","Type":"ContainerDied","Data":"df36d58c667e0296b3cba7b8ad6ba39a193b775148f71cad6f406ece58115453"} Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.961308 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df36d58c667e0296b3cba7b8ad6ba39a193b775148f71cad6f406ece58115453" Oct 02 18:39:24 crc kubenswrapper[4909]: I1002 18:39:24.961868 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cb4f8" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.201278 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nkdzc"] Oct 02 18:39:25 crc kubenswrapper[4909]: E1002 18:39:25.201646 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9483cd24-6cfe-48cb-a812-8059a42cb41e" containerName="keystone-db-sync" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.201663 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9483cd24-6cfe-48cb-a812-8059a42cb41e" containerName="keystone-db-sync" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.201876 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9483cd24-6cfe-48cb-a812-8059a42cb41e" containerName="keystone-db-sync" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.202515 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.205955 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.206508 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zsxbj" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.206837 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.208468 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.228528 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.237399 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nkdzc"] Oct 02 
18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.259594 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.261128 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.266977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrgn4\" (UniqueName: \"kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267177 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267342 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267403 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 
18:39:25.267619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267731 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9772b\" (UniqueName: \"kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.267811 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.293133 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.303073 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.341452 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-f8b6m"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.342627 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f8b6m" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.346975 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2f54j" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.346988 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.360668 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f8b6m"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369363 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9772b\" (UniqueName: \"kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369409 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys\") pod 
\"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369470 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgn4\" (UniqueName: \"kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369543 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " 
pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369560 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369606 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kq79\" (UniqueName: \"kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.369663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.370496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.375577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.376090 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.378773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.379870 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.390486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.393095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.393323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.397098 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9772b\" (UniqueName: \"kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b\") pod \"dnsmasq-dns-55fff446b9-kr4h4\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.401611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.402808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgn4\" (UniqueName: \"kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.404439 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle\") pod \"keystone-bootstrap-nkdzc\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.444684 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-k7dmx"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.446144 4909 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.457117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.457297 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v9vvh"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.457406 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.460628 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k7dmx"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.476159 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481004 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbfl\" (UniqueName: \"kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481421 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kq79\" (UniqueName: \"kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.481510 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.491679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.511181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.511249 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-xldr9"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.512363 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.517231 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kq79\" (UniqueName: \"kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79\") pod \"heat-db-sync-f8b6m\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.520650 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nkdzc"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.547884 4909 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-db-sync-f8b6m"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.554352 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.554501 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.556511 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xrzr6"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.575475 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.576870 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xldr9"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvx5\" (UniqueName: \"kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584828 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584887 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbfl\" (UniqueName: \"kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.584959 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.585081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.589761 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.592457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.599543 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.600667 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.601878 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.605948 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.609668 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.612451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbfl\" (UniqueName: \"kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl\") pod \"cinder-db-sync-k7dmx\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " pod="openstack/cinder-db-sync-k7dmx"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.627577 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zw6nz"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.629083 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.629219 4909 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.632546 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ghgpp"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.632682 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.633488 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.634097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zw6nz"]
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.686989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.687076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.687103 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.687304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4vv\" (UniqueName: \"kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.687496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvx5\" (UniqueName: \"kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.687942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llpj4\" (UniqueName: \"kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688267 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688325 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688533 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.688633 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.693779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.695869 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.707756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvx5\" (UniqueName: \"kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5\") pod \"neutron-db-sync-xldr9\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " pod="openstack/neutron-db-sync-xldr9"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.791965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.792015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName:
\"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.792771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.792801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.792937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj4vv\" (UniqueName: \"kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.793502 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llpj4\" (UniqueName: \"kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.794385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.794448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.794478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.794342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.795376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.795422 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd"
Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.797478
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.799544 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.808438 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.821606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4vv\" (UniqueName: \"kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv\") pod \"placement-db-sync-zw6nz\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " pod="openstack/placement-db-sync-zw6nz" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.833016 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llpj4\" (UniqueName: \"kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4\") pod \"dnsmasq-dns-76fcf4b695-9kwsd\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.861834 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-k7dmx" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.914676 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xldr9" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.919609 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.923870 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.929411 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.929618 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.930525 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.933085 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.967464 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zw6nz" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.998813 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.998851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.998907 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28xm5\" (UniqueName: \"kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.998927 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.998954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 
18:39:25.998972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:25 crc kubenswrapper[4909]: I1002 18:39:25.999017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.006624 4909 generic.go:334] "Generic (PLEG): container finished" podID="9bb07eb9-8b5c-417f-b160-79a170f933a7" containerID="233b67568efbcbe1bcb24248360992b68d2a0845dd15286899bf74a6db6a3830" exitCode=0 Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.006667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" event={"ID":"9bb07eb9-8b5c-417f-b160-79a170f933a7","Type":"ContainerDied","Data":"233b67568efbcbe1bcb24248360992b68d2a0845dd15286899bf74a6db6a3830"} Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.006691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" event={"ID":"9bb07eb9-8b5c-417f-b160-79a170f933a7","Type":"ContainerStarted","Data":"b7327b9444fc4901d2469eb21d46b0a18f8647f01527bc91ada4a73aa1bd4e62"} Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102549 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102600 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28xm5\" (UniqueName: \"kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.102711 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts\") pod \"ceilometer-0\" (UID: 
\"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.106670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.106699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.107701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.113076 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.115797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.116324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.143770 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28xm5\" (UniqueName: \"kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5\") pod \"ceilometer-0\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.153962 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nkdzc"] Oct 02 18:39:26 crc kubenswrapper[4909]: W1002 18:39:26.171397 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc539da5a_fdae_4562_bcb4_4ec5654823a0.slice/crio-afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321 WatchSource:0}: Error finding container afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321: Status 404 returned error can't find the container with id afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321 Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.257241 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.528871 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f8b6m"] Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.553579 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"] Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.657904 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k7dmx"] Oct 02 18:39:26 crc kubenswrapper[4909]: W1002 18:39:26.728372 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce4584a9_c2b3_4d6e_976e_131b8b349d79.slice/crio-369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693 WatchSource:0}: Error finding container 369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693: Status 404 returned error can't find the container with id 369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693 Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.751785 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.922971 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.923047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.923115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.923212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.923239 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.923283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqsg\" 
(UniqueName: \"kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg\") pod \"9bb07eb9-8b5c-417f-b160-79a170f933a7\" (UID: \"9bb07eb9-8b5c-417f-b160-79a170f933a7\") " Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.928197 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg" (OuterVolumeSpecName: "kube-api-access-2qqsg") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "kube-api-access-2qqsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.946539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config" (OuterVolumeSpecName: "config") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.949993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.956804 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.957385 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:26 crc kubenswrapper[4909]: I1002 18:39:26.967612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9bb07eb9-8b5c-417f-b160-79a170f933a7" (UID: "9bb07eb9-8b5c-417f-b160-79a170f933a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.030622 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"] Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.031216 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qqsg\" (UniqueName: \"kubernetes.io/projected/9bb07eb9-8b5c-417f-b160-79a170f933a7-kube-api-access-2qqsg\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.033187 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.033260 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.033313 4909 reconciler_common.go:293] "Volume detached 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.033371 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.033423 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb07eb9-8b5c-417f-b160-79a170f933a7-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:27 crc kubenswrapper[4909]: W1002 18:39:27.036293 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b4cb24_4aa2_4311_ac0b_9664ff0ec33b.slice/crio-e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795 WatchSource:0}: Error finding container e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795: Status 404 returned error can't find the container with id e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795 Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.036697 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nkdzc" event={"ID":"c539da5a-fdae-4562-bcb4-4ec5654823a0","Type":"ContainerStarted","Data":"45e0d6c2580e99c6e480e5d0c9b179166fe8a90bf1e2b479963a6c60c1f3ca3e"} Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.036741 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nkdzc" event={"ID":"c539da5a-fdae-4562-bcb4-4ec5654823a0","Type":"ContainerStarted","Data":"afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321"} Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.043693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-db-sync-f8b6m" event={"ID":"e7f47e10-7e95-4d88-ae97-71b020828eaa","Type":"ContainerStarted","Data":"bd01683c35a5552ccba798a84d504f9321324adc43ada5ab439c290cca8e9851"} Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.046282 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7dmx" event={"ID":"ce4584a9-c2b3-4d6e-976e-131b8b349d79","Type":"ContainerStarted","Data":"369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693"} Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.050604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" event={"ID":"79281bfe-23eb-4674-8956-c7c007505cac","Type":"ContainerStarted","Data":"381ec2c2924c71d6d4b6a2961d8f7d3788b825b2161bf71f4cb59f0284bbbfd7"} Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.061270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zw6nz"] Oct 02 18:39:27 crc kubenswrapper[4909]: W1002 18:39:27.068877 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf2d8c4_630f_4e52_86b5_d20381e90564.slice/crio-554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493 WatchSource:0}: Error finding container 554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493: Status 404 returned error can't find the container with id 554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493 Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.068920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xldr9"] Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.070880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" event={"ID":"9bb07eb9-8b5c-417f-b160-79a170f933a7","Type":"ContainerDied","Data":"b7327b9444fc4901d2469eb21d46b0a18f8647f01527bc91ada4a73aa1bd4e62"} Oct 02 18:39:27 crc 
kubenswrapper[4909]: I1002 18:39:27.070918 4909 scope.go:117] "RemoveContainer" containerID="233b67568efbcbe1bcb24248360992b68d2a0845dd15286899bf74a6db6a3830" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.071084 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-7vjx6" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.071681 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nkdzc" podStartSLOduration=2.071665134 podStartE2EDuration="2.071665134s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:27.064416378 +0000 UTC m=+1288.251912237" watchObservedRunningTime="2025-10-02 18:39:27.071665134 +0000 UTC m=+1288.259160993" Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.092797 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.145143 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.153922 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-7vjx6"] Oct 02 18:39:27 crc kubenswrapper[4909]: I1002 18:39:27.621294 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb07eb9-8b5c-417f-b160-79a170f933a7" path="/var/lib/kubelet/pods/9bb07eb9-8b5c-417f-b160-79a170f933a7/volumes" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.155315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xldr9" event={"ID":"20be3d30-b225-41d3-a951-790a9b22d89c","Type":"ContainerStarted","Data":"e471edfc1a0940cca9a1b1dba3c96eaa76570a69d08a1be101400119dbeb604f"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 
18:39:28.155360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xldr9" event={"ID":"20be3d30-b225-41d3-a951-790a9b22d89c","Type":"ContainerStarted","Data":"5aaad1c76ded8833874f44f06be81ba775730044a8cffdf131bd42d82207483f"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.184597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zw6nz" event={"ID":"5bf2d8c4-630f-4e52-86b5-d20381e90564","Type":"ContainerStarted","Data":"554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.212051 4909 generic.go:334] "Generic (PLEG): container finished" podID="79281bfe-23eb-4674-8956-c7c007505cac" containerID="e0a1bc307b1534a151be62bc0083968b1b01e76bd373846c7a556c69ad440cf2" exitCode=0 Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.212366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" event={"ID":"79281bfe-23eb-4674-8956-c7c007505cac","Type":"ContainerDied","Data":"e0a1bc307b1534a151be62bc0083968b1b01e76bd373846c7a556c69ad440cf2"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.219933 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-xldr9" podStartSLOduration=3.219917713 podStartE2EDuration="3.219917713s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:28.215151084 +0000 UTC m=+1289.402646943" watchObservedRunningTime="2025-10-02 18:39:28.219917713 +0000 UTC m=+1289.407413572" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.275325 4909 generic.go:334] "Generic (PLEG): container finished" podID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerID="90f0912ef82d705372af384b0b68d71e34c881e06fb003e6dc5b7df612a74129" exitCode=0 Oct 02 18:39:28 crc kubenswrapper[4909]: 
I1002 18:39:28.275413 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" event={"ID":"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b","Type":"ContainerDied","Data":"90f0912ef82d705372af384b0b68d71e34c881e06fb003e6dc5b7df612a74129"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.275437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" event={"ID":"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b","Type":"ContainerStarted","Data":"e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.287240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerStarted","Data":"c7b09797e2f09efe71758f23d6deba689f9668f9c05a3489c308ff3f9e65303a"} Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.460334 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.729405 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-r4lmb"] Oct 02 18:39:28 crc kubenswrapper[4909]: E1002 18:39:28.730035 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb07eb9-8b5c-417f-b160-79a170f933a7" containerName="init" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.730051 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb07eb9-8b5c-417f-b160-79a170f933a7" containerName="init" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.730254 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb07eb9-8b5c-417f-b160-79a170f933a7" containerName="init" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.730851 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.733273 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vj7rl" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.735709 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.740215 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-r4lmb"] Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.803518 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.868548 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.868607 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.868673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5gp\" (UniqueName: \"kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc 
kubenswrapper[4909]: I1002 18:39:28.970430 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.970493 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.970522 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.970803 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9772b\" (UniqueName: \"kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.970880 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.970933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0\") pod \"79281bfe-23eb-4674-8956-c7c007505cac\" (UID: \"79281bfe-23eb-4674-8956-c7c007505cac\") " Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.971200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5gp\" (UniqueName: \"kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.971327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.971369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.980577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.987232 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b" (OuterVolumeSpecName: "kube-api-access-9772b") pod 
"79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "kube-api-access-9772b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:28 crc kubenswrapper[4909]: I1002 18:39:28.997643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.005344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5gp\" (UniqueName: \"kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp\") pod \"barbican-db-sync-r4lmb\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.018310 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.019695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.022190 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config" (OuterVolumeSpecName: "config") pod "79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.026751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.027559 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79281bfe-23eb-4674-8956-c7c007505cac" (UID: "79281bfe-23eb-4674-8956-c7c007505cac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.052454 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073554 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9772b\" (UniqueName: \"kubernetes.io/projected/79281bfe-23eb-4674-8956-c7c007505cac-kube-api-access-9772b\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073805 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073816 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073825 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073833 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.073840 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79281bfe-23eb-4674-8956-c7c007505cac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.305497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" event={"ID":"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b","Type":"ContainerStarted","Data":"6cf01c83ac2ad4c8461040d5fb82d2fdf259acd2f9b842d337c2652f0f38d006"} Oct 02 18:39:29 
crc kubenswrapper[4909]: I1002 18:39:29.306685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.317606 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.321146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-kr4h4" event={"ID":"79281bfe-23eb-4674-8956-c7c007505cac","Type":"ContainerDied","Data":"381ec2c2924c71d6d4b6a2961d8f7d3788b825b2161bf71f4cb59f0284bbbfd7"} Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.321217 4909 scope.go:117] "RemoveContainer" containerID="e0a1bc307b1534a151be62bc0083968b1b01e76bd373846c7a556c69ad440cf2" Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.334564 4909 generic.go:334] "Generic (PLEG): container finished" podID="bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7" containerID="324b86dbc015245016ef098952c910e32a031897915660f8a4b7227c47f5b444" exitCode=0 Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.335367 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerDied","Data":"324b86dbc015245016ef098952c910e32a031897915660f8a4b7227c47f5b444"} Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.352551 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" podStartSLOduration=4.352529964 podStartE2EDuration="4.352529964s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:29.326450878 +0000 UTC m=+1290.513946737" watchObservedRunningTime="2025-10-02 18:39:29.352529964 +0000 UTC m=+1290.540025823" Oct 02 18:39:29 crc 
kubenswrapper[4909]: I1002 18:39:29.567776 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-r4lmb"] Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.662486 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"] Oct 02 18:39:29 crc kubenswrapper[4909]: I1002 18:39:29.682828 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-kr4h4"] Oct 02 18:39:30 crc kubenswrapper[4909]: I1002 18:39:30.359257 4909 generic.go:334] "Generic (PLEG): container finished" podID="2671433b-366d-48ec-90d4-28a4ed3cece9" containerID="f4d3924233efd779b395c1134fca2e63cb7d8252538b49009c7ae427c4b6b1a1" exitCode=0 Oct 02 18:39:30 crc kubenswrapper[4909]: I1002 18:39:30.360276 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m25jj" event={"ID":"2671433b-366d-48ec-90d4-28a4ed3cece9","Type":"ContainerDied","Data":"f4d3924233efd779b395c1134fca2e63cb7d8252538b49009c7ae427c4b6b1a1"} Oct 02 18:39:30 crc kubenswrapper[4909]: I1002 18:39:30.371545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerStarted","Data":"67344e108152cc0476a47dcc038ad2e8d5f9d1c34ff62143722a6839555b7bad"} Oct 02 18:39:30 crc kubenswrapper[4909]: I1002 18:39:30.380001 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r4lmb" event={"ID":"af170421-1a33-4e1d-b1ef-83ac5388791c","Type":"ContainerStarted","Data":"f7e0f0fe46b51391dd10e3a36c7067d3a7c2d886b21af2880fd1d88607d39adb"} Oct 02 18:39:31 crc kubenswrapper[4909]: I1002 18:39:31.402591 4909 generic.go:334] "Generic (PLEG): container finished" podID="c539da5a-fdae-4562-bcb4-4ec5654823a0" containerID="45e0d6c2580e99c6e480e5d0c9b179166fe8a90bf1e2b479963a6c60c1f3ca3e" exitCode=0 Oct 02 18:39:31 crc kubenswrapper[4909]: I1002 18:39:31.402704 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-nkdzc" event={"ID":"c539da5a-fdae-4562-bcb4-4ec5654823a0","Type":"ContainerDied","Data":"45e0d6c2580e99c6e480e5d0c9b179166fe8a90bf1e2b479963a6c60c1f3ca3e"} Oct 02 18:39:31 crc kubenswrapper[4909]: I1002 18:39:31.634576 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79281bfe-23eb-4674-8956-c7c007505cac" path="/var/lib/kubelet/pods/79281bfe-23eb-4674-8956-c7c007505cac/volumes" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.073925 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270564 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys\") pod \"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrgn4\" (UniqueName: \"kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4\") pod \"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data\") pod \"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270808 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys\") pod 
\"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle\") pod \"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.270919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts\") pod \"c539da5a-fdae-4562-bcb4-4ec5654823a0\" (UID: \"c539da5a-fdae-4562-bcb4-4ec5654823a0\") " Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.277272 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.277296 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4" (OuterVolumeSpecName: "kube-api-access-qrgn4") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "kube-api-access-qrgn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.279532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts" (OuterVolumeSpecName: "scripts") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.280939 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.304978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data" (OuterVolumeSpecName: "config-data") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.322579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c539da5a-fdae-4562-bcb4-4ec5654823a0" (UID: "c539da5a-fdae-4562-bcb4-4ec5654823a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373175 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373213 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373227 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373236 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373246 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c539da5a-fdae-4562-bcb4-4ec5654823a0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.373254 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrgn4\" (UniqueName: \"kubernetes.io/projected/c539da5a-fdae-4562-bcb4-4ec5654823a0-kube-api-access-qrgn4\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.469361 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerStarted","Data":"8ef5366fabc3f08ba9180ea37ccf1d160db1d82abeff1e9dc7150b73db20847e"} Oct 02 18:39:33 crc kubenswrapper[4909]: 
I1002 18:39:33.486505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nkdzc" event={"ID":"c539da5a-fdae-4562-bcb4-4ec5654823a0","Type":"ContainerDied","Data":"afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321"} Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.486550 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd908bfc6ad85a0be9a8587fbebab62c2df9a05c554a167c8ebb0054e477321" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.486558 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nkdzc" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.525684 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nkdzc"] Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.532560 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nkdzc"] Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.627344 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c539da5a-fdae-4562-bcb4-4ec5654823a0" path="/var/lib/kubelet/pods/c539da5a-fdae-4562-bcb4-4ec5654823a0/volumes" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.628126 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rcr26"] Oct 02 18:39:33 crc kubenswrapper[4909]: E1002 18:39:33.628559 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c539da5a-fdae-4562-bcb4-4ec5654823a0" containerName="keystone-bootstrap" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.628582 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c539da5a-fdae-4562-bcb4-4ec5654823a0" containerName="keystone-bootstrap" Oct 02 18:39:33 crc kubenswrapper[4909]: E1002 18:39:33.628631 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79281bfe-23eb-4674-8956-c7c007505cac" containerName="init" Oct 
02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.628640 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79281bfe-23eb-4674-8956-c7c007505cac" containerName="init" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.628955 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c539da5a-fdae-4562-bcb4-4ec5654823a0" containerName="keystone-bootstrap" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.628981 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="79281bfe-23eb-4674-8956-c7c007505cac" containerName="init" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.630248 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.632531 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.632750 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.632880 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zsxbj" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.633231 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.634313 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rcr26"] Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.782550 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc 
kubenswrapper[4909]: I1002 18:39:33.782616 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.782714 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.782807 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.782877 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.782930 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzg8x\" (UniqueName: \"kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc 
kubenswrapper[4909]: I1002 18:39:33.884738 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.884843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.884939 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.884991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.885050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzg8x\" (UniqueName: \"kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.885085 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.890460 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.890907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.891273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.891888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.893483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys\") pod \"keystone-bootstrap-rcr26\" 
(UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.907647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzg8x\" (UniqueName: \"kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x\") pod \"keystone-bootstrap-rcr26\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:33 crc kubenswrapper[4909]: I1002 18:39:33.950470 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:39:35 crc kubenswrapper[4909]: I1002 18:39:35.932222 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:39:35 crc kubenswrapper[4909]: I1002 18:39:35.997288 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:39:35 crc kubenswrapper[4909]: I1002 18:39:35.997648 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-j28mh" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" containerID="cri-o://0e13f289e4db459670b65411637693512c793cd459cbfe0ac5cf0f258403c830" gracePeriod=10 Oct 02 18:39:36 crc kubenswrapper[4909]: I1002 18:39:36.524080 4909 generic.go:334] "Generic (PLEG): container finished" podID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerID="0e13f289e4db459670b65411637693512c793cd459cbfe0ac5cf0f258403c830" exitCode=0 Oct 02 18:39:36 crc kubenswrapper[4909]: I1002 18:39:36.524128 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerDied","Data":"0e13f289e4db459670b65411637693512c793cd459cbfe0ac5cf0f258403c830"} Oct 02 18:39:38 crc kubenswrapper[4909]: I1002 18:39:38.530347 4909 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-j28mh" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.555462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m25jj" event={"ID":"2671433b-366d-48ec-90d4-28a4ed3cece9","Type":"ContainerDied","Data":"561382fa2cb7a64742bdb5f1430f1f1245670f13ab0e59641ff8213940564a3e"} Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.555759 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="561382fa2cb7a64742bdb5f1430f1f1245670f13ab0e59641ff8213940564a3e" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.581676 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.715865 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle\") pod \"2671433b-366d-48ec-90d4-28a4ed3cece9\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.715940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data\") pod \"2671433b-366d-48ec-90d4-28a4ed3cece9\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.716356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pws8\" (UniqueName: \"kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8\") pod \"2671433b-366d-48ec-90d4-28a4ed3cece9\" (UID: 
\"2671433b-366d-48ec-90d4-28a4ed3cece9\") " Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.716432 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data\") pod \"2671433b-366d-48ec-90d4-28a4ed3cece9\" (UID: \"2671433b-366d-48ec-90d4-28a4ed3cece9\") " Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.725376 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2671433b-366d-48ec-90d4-28a4ed3cece9" (UID: "2671433b-366d-48ec-90d4-28a4ed3cece9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.725408 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8" (OuterVolumeSpecName: "kube-api-access-8pws8") pod "2671433b-366d-48ec-90d4-28a4ed3cece9" (UID: "2671433b-366d-48ec-90d4-28a4ed3cece9"). InnerVolumeSpecName "kube-api-access-8pws8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.747737 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2671433b-366d-48ec-90d4-28a4ed3cece9" (UID: "2671433b-366d-48ec-90d4-28a4ed3cece9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.799222 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data" (OuterVolumeSpecName: "config-data") pod "2671433b-366d-48ec-90d4-28a4ed3cece9" (UID: "2671433b-366d-48ec-90d4-28a4ed3cece9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.819347 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pws8\" (UniqueName: \"kubernetes.io/projected/2671433b-366d-48ec-90d4-28a4ed3cece9-kube-api-access-8pws8\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.819387 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.819397 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:39 crc kubenswrapper[4909]: I1002 18:39:39.819406 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2671433b-366d-48ec-90d4-28a4ed3cece9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.565880 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m25jj" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.969203 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:40 crc kubenswrapper[4909]: E1002 18:39:40.970155 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2671433b-366d-48ec-90d4-28a4ed3cece9" containerName="glance-db-sync" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.970168 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2671433b-366d-48ec-90d4-28a4ed3cece9" containerName="glance-db-sync" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.970369 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2671433b-366d-48ec-90d4-28a4ed3cece9" containerName="glance-db-sync" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.971390 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:40 crc kubenswrapper[4909]: I1002 18:39:40.992177 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.147232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.147341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc 
kubenswrapper[4909]: I1002 18:39:41.147367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.147394 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5c2\" (UniqueName: \"kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.147418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.147433 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.248486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc 
kubenswrapper[4909]: I1002 18:39:41.248592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.248621 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.248650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5c2\" (UniqueName: \"kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.248676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.248690 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.249737 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.249903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.250001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.250103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.250242 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.264778 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5c2\" (UniqueName: 
\"kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2\") pod \"dnsmasq-dns-8b5c85b87-b6z29\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:41 crc kubenswrapper[4909]: I1002 18:39:41.289063 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:48 crc kubenswrapper[4909]: I1002 18:39:48.530432 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-j28mh" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout" Oct 02 18:39:48 crc kubenswrapper[4909]: I1002 18:39:48.669360 4909 generic.go:334] "Generic (PLEG): container finished" podID="20be3d30-b225-41d3-a951-790a9b22d89c" containerID="e471edfc1a0940cca9a1b1dba3c96eaa76570a69d08a1be101400119dbeb604f" exitCode=0 Oct 02 18:39:48 crc kubenswrapper[4909]: I1002 18:39:48.669432 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xldr9" event={"ID":"20be3d30-b225-41d3-a951-790a9b22d89c","Type":"ContainerDied","Data":"e471edfc1a0940cca9a1b1dba3c96eaa76570a69d08a1be101400119dbeb604f"} Oct 02 18:39:48 crc kubenswrapper[4909]: I1002 18:39:48.832065 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.008530 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config\") pod \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.008578 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f56fh\" (UniqueName: \"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh\") pod \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.008650 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb\") pod \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.008688 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc\") pod \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.008848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb\") pod \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\" (UID: \"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1\") " Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.016258 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh" (OuterVolumeSpecName: "kube-api-access-f56fh") pod "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" (UID: "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1"). InnerVolumeSpecName "kube-api-access-f56fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.060887 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" (UID: "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.068716 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" (UID: "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.079712 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config" (OuterVolumeSpecName: "config") pod "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" (UID: "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.091963 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" (UID: "bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.111620 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.111877 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.111940 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f56fh\" (UniqueName: \"kubernetes.io/projected/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-kube-api-access-f56fh\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.112004 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.112074 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.683123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j28mh" event={"ID":"bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1","Type":"ContainerDied","Data":"83dac0a0f2e20cdfba174a39cc6840d921fadade571bd57ae2f829139276b58e"} Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.683285 4909 scope.go:117] "RemoveContainer" containerID="0e13f289e4db459670b65411637693512c793cd459cbfe0ac5cf0f258403c830" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.683736 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j28mh" Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.729763 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:39:49 crc kubenswrapper[4909]: I1002 18:39:49.741218 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j28mh"] Oct 02 18:39:51 crc kubenswrapper[4909]: E1002 18:39:51.095165 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 02 18:39:51 crc kubenswrapper[4909]: E1002 18:39:51.095627 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubP
athExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7hbfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-k7dmx_openstack(ce4584a9-c2b3-4d6e-976e-131b8b349d79): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:39:51 crc kubenswrapper[4909]: E1002 18:39:51.099104 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-k7dmx" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" Oct 02 18:39:51 crc kubenswrapper[4909]: I1002 18:39:51.629005 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" path="/var/lib/kubelet/pods/bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1/volumes" Oct 02 18:39:51 crc 
kubenswrapper[4909]: E1002 18:39:51.706428 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-k7dmx" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.134216 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xldr9" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.219316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle\") pod \"20be3d30-b225-41d3-a951-790a9b22d89c\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.219366 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bvx5\" (UniqueName: \"kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5\") pod \"20be3d30-b225-41d3-a951-790a9b22d89c\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.219527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config\") pod \"20be3d30-b225-41d3-a951-790a9b22d89c\" (UID: \"20be3d30-b225-41d3-a951-790a9b22d89c\") " Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.234310 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5" (OuterVolumeSpecName: "kube-api-access-5bvx5") pod "20be3d30-b225-41d3-a951-790a9b22d89c" (UID: "20be3d30-b225-41d3-a951-790a9b22d89c"). 
InnerVolumeSpecName "kube-api-access-5bvx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.245882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config" (OuterVolumeSpecName: "config") pod "20be3d30-b225-41d3-a951-790a9b22d89c" (UID: "20be3d30-b225-41d3-a951-790a9b22d89c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.258222 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20be3d30-b225-41d3-a951-790a9b22d89c" (UID: "20be3d30-b225-41d3-a951-790a9b22d89c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.323403 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.323434 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20be3d30-b225-41d3-a951-790a9b22d89c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.323447 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bvx5\" (UniqueName: \"kubernetes.io/projected/20be3d30-b225-41d3-a951-790a9b22d89c-kube-api-access-5bvx5\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.532221 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-j28mh" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.727014 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xldr9" event={"ID":"20be3d30-b225-41d3-a951-790a9b22d89c","Type":"ContainerDied","Data":"5aaad1c76ded8833874f44f06be81ba775730044a8cffdf131bd42d82207483f"} Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.727079 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aaad1c76ded8833874f44f06be81ba775730044a8cffdf131bd42d82207483f" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.727132 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xldr9" Oct 02 18:39:53 crc kubenswrapper[4909]: E1002 18:39:53.789300 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 02 18:39:53 crc kubenswrapper[4909]: E1002 18:39:53.789475 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4d5gp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-r4lmb_openstack(af170421-1a33-4e1d-b1ef-83ac5388791c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:39:53 crc kubenswrapper[4909]: E1002 18:39:53.790704 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-r4lmb" 
podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" Oct 02 18:39:53 crc kubenswrapper[4909]: I1002 18:39:53.813072 4909 scope.go:117] "RemoveContainer" containerID="05a901e70c829631a65398a5400d201062d524ec9ab9666de970be3a1df6795f" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.346106 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rcr26"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.492695 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.566641 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.592779 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:39:54 crc kubenswrapper[4909]: E1002 18:39:54.593219 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.593240 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" Oct 02 18:39:54 crc kubenswrapper[4909]: E1002 18:39:54.593252 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="init" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.593258 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="init" Oct 02 18:39:54 crc kubenswrapper[4909]: E1002 18:39:54.593334 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20be3d30-b225-41d3-a951-790a9b22d89c" containerName="neutron-db-sync" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.593343 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20be3d30-b225-41d3-a951-790a9b22d89c" containerName="neutron-db-sync" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.593570 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbea14e5-7c6a-45e8-9f93-598e3fdfa9b1" containerName="dnsmasq-dns" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.593594 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20be3d30-b225-41d3-a951-790a9b22d89c" containerName="neutron-db-sync" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.596506 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.631955 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.633451 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.641546 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xrzr6" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.642018 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.642156 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.642539 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.655171 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684184 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd5gj\" (UniqueName: \"kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684347 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrhx\" (UniqueName: \"kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684381 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684429 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684585 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.684604 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.686604 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.748328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zw6nz" event={"ID":"5bf2d8c4-630f-4e52-86b5-d20381e90564","Type":"ContainerStarted","Data":"3ad6fdc0d4c617c974619894478fecf07fd6e1e0f7dcf6f29cfd95062fcf3313"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.750326 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f8b6m" event={"ID":"e7f47e10-7e95-4d88-ae97-71b020828eaa","Type":"ContainerStarted","Data":"a4886512dd74e4bf92e75a07d365db5432c367595ddfd519be9f05f6fc8878b8"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.755114 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7","Type":"ContainerStarted","Data":"9de2b88205cf66f2a4413123a02047c669c746e9f7f18a1357f523c65fc126ad"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.757324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcr26" event={"ID":"4e106d88-ec64-496f-b8a4-ca8137353399","Type":"ContainerStarted","Data":"de0e11e5ab4efe4d2925fe8200afb644af4140991c8ad1137253fa265bc7b920"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.757363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcr26" 
event={"ID":"4e106d88-ec64-496f-b8a4-ca8137353399","Type":"ContainerStarted","Data":"14205984170c742f0b8fca70cce529d3797101b1673a1dfec9c08a1aff4ae116"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.765365 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zw6nz" podStartSLOduration=3.812740847 podStartE2EDuration="29.765350364s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="2025-10-02 18:39:27.074804882 +0000 UTC m=+1288.262300741" lastFinishedPulling="2025-10-02 18:39:53.027414389 +0000 UTC m=+1314.214910258" observedRunningTime="2025-10-02 18:39:54.762447074 +0000 UTC m=+1315.949942933" watchObservedRunningTime="2025-10-02 18:39:54.765350364 +0000 UTC m=+1315.952846223" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.780349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" event={"ID":"21ea0645-5a72-4be7-b21b-d6abdfd9b351","Type":"ContainerStarted","Data":"456fc7ec695d1af3111166af849706d1727727d8cd74a3df6d1a9bf035dab740"} Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.785842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793512 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd5gj\" (UniqueName: \"kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrhx\" (UniqueName: \"kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793916 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.793994 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.794110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.794437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.787614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.799177 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0\") pod 
\"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.800249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.800298 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.800342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.800452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerStarted","Data":"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"} Oct 02 18:39:54 crc kubenswrapper[4909]: E1002 18:39:54.801385 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-r4lmb" podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.808801 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.816705 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-f8b6m" podStartSLOduration=3.324497391 podStartE2EDuration="29.816682898s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="2025-10-02 18:39:26.54124342 +0000 UTC m=+1287.728739279" lastFinishedPulling="2025-10-02 18:39:53.033428937 +0000 UTC m=+1314.220924786" observedRunningTime="2025-10-02 18:39:54.779447265 +0000 UTC m=+1315.966943134" watchObservedRunningTime="2025-10-02 18:39:54.816682898 +0000 UTC m=+1316.004178757" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.816896 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.818663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.819424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " 
pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.820097 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rcr26" podStartSLOduration=21.820083385 podStartE2EDuration="21.820083385s" podCreationTimestamp="2025-10-02 18:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:54.800189613 +0000 UTC m=+1315.987685462" watchObservedRunningTime="2025-10-02 18:39:54.820083385 +0000 UTC m=+1316.007579234" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.828023 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd5gj\" (UniqueName: \"kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj\") pod \"dnsmasq-dns-84b966f6c9-6nhsg\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.828545 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrhx\" (UniqueName: \"kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx\") pod \"neutron-fdf9896db-xbknn\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.847282 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=37.847265114 podStartE2EDuration="37.847265114s" podCreationTimestamp="2025-10-02 18:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:54.824922126 +0000 UTC m=+1316.012417985" watchObservedRunningTime="2025-10-02 18:39:54.847265114 +0000 UTC m=+1316.034760973" Oct 02 18:39:54 crc kubenswrapper[4909]: 
I1002 18:39:54.958663 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:54 crc kubenswrapper[4909]: I1002 18:39:54.979074 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.657064 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.680807 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.832458 4909 generic.go:334] "Generic (PLEG): container finished" podID="21ea0645-5a72-4be7-b21b-d6abdfd9b351" containerID="b61dafdeef24d57495e7580cbb1b2607e83e6d404a7dac0e4f7811f3bdd9a83e" exitCode=0 Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.832505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" event={"ID":"21ea0645-5a72-4be7-b21b-d6abdfd9b351","Type":"ContainerDied","Data":"b61dafdeef24d57495e7580cbb1b2607e83e6d404a7dac0e4f7811f3bdd9a83e"} Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.840347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" event={"ID":"cf0443bc-15f9-47c0-9105-a1acb3ff6998","Type":"ContainerStarted","Data":"fa9182a132ad12da56516fcf8b1394d324535b8fbc23ed690bd320a6c3bb205e"} Oct 02 18:39:55 crc kubenswrapper[4909]: I1002 18:39:55.845137 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerStarted","Data":"517ead285c1fccc390441a6407713289e89f099fc5c336bae3eb329c6faa0e7c"} Oct 02 18:39:56 crc kubenswrapper[4909]: I1002 18:39:56.853654 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" 
event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerStarted","Data":"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de"} Oct 02 18:39:56 crc kubenswrapper[4909]: I1002 18:39:56.859347 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerID="7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d" exitCode=0 Oct 02 18:39:56 crc kubenswrapper[4909]: I1002 18:39:56.859387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" event={"ID":"cf0443bc-15f9-47c0-9105-a1acb3ff6998","Type":"ContainerDied","Data":"7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d"} Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.054594 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.142382 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.142917 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.142949 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 
18:39:57.143411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.143465 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc5c2\" (UniqueName: \"kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.143502 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0\") pod \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\" (UID: \"21ea0645-5a72-4be7-b21b-d6abdfd9b351\") " Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.158755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2" (OuterVolumeSpecName: "kube-api-access-hc5c2") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "kube-api-access-hc5c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.184781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.237392 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config" (OuterVolumeSpecName: "config") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.241658 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.245884 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.245910 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.245918 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.245929 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc5c2\" (UniqueName: \"kubernetes.io/projected/21ea0645-5a72-4be7-b21b-d6abdfd9b351-kube-api-access-hc5c2\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc 
kubenswrapper[4909]: I1002 18:39:57.247429 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.285267 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21ea0645-5a72-4be7-b21b-d6abdfd9b351" (UID: "21ea0645-5a72-4be7-b21b-d6abdfd9b351"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.347992 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.348034 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea0645-5a72-4be7-b21b-d6abdfd9b351-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.524421 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-849fddf4f5-f44kx"] Oct 02 18:39:57 crc kubenswrapper[4909]: E1002 18:39:57.524934 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ea0645-5a72-4be7-b21b-d6abdfd9b351" containerName="init" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.524957 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ea0645-5a72-4be7-b21b-d6abdfd9b351" containerName="init" Oct 02 18:39:57 crc kubenswrapper[4909]: 
I1002 18:39:57.525218 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ea0645-5a72-4be7-b21b-d6abdfd9b351" containerName="init" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.526404 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.535662 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-849fddf4f5-f44kx"] Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.537337 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.537355 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-combined-ca-bundle\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552124 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-internal-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552205 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-ovndb-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " 
pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552236 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-public-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552277 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-httpd-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552309 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xd22\" (UniqueName: \"kubernetes.io/projected/84148aec-ab89-4621-8448-b1bbbf294ad5-kube-api-access-9xd22\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.552336 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xd22\" (UniqueName: \"kubernetes.io/projected/84148aec-ab89-4621-8448-b1bbbf294ad5-kube-api-access-9xd22\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " 
pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654221 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-combined-ca-bundle\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-internal-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654435 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-ovndb-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-public-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.654568 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-httpd-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.663625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-combined-ca-bundle\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.665308 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-ovndb-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.666953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-public-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.668849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.669818 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-httpd-config\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.673626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84148aec-ab89-4621-8448-b1bbbf294ad5-internal-tls-certs\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.677586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xd22\" (UniqueName: \"kubernetes.io/projected/84148aec-ab89-4621-8448-b1bbbf294ad5-kube-api-access-9xd22\") pod \"neutron-849fddf4f5-f44kx\" (UID: \"84148aec-ab89-4621-8448-b1bbbf294ad5\") " pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.846490 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.869187 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" event={"ID":"21ea0645-5a72-4be7-b21b-d6abdfd9b351","Type":"ContainerDied","Data":"456fc7ec695d1af3111166af849706d1727727d8cd74a3df6d1a9bf035dab740"} Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.869238 4909 scope.go:117] "RemoveContainer" containerID="b61dafdeef24d57495e7580cbb1b2607e83e6d404a7dac0e4f7811f3bdd9a83e" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.869237 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-b6z29" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.874811 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerStarted","Data":"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"} Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.877758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" event={"ID":"cf0443bc-15f9-47c0-9105-a1acb3ff6998","Type":"ContainerStarted","Data":"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2"} Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.878099 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.882623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerStarted","Data":"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd"} Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.883472 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.940553 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.955767 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-b6z29"] Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.961783 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fdf9896db-xbknn" podStartSLOduration=3.961764161 podStartE2EDuration="3.961764161s" podCreationTimestamp="2025-10-02 18:39:54 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:57.940448505 +0000 UTC m=+1319.127944364" watchObservedRunningTime="2025-10-02 18:39:57.961764161 +0000 UTC m=+1319.149260020" Oct 02 18:39:57 crc kubenswrapper[4909]: I1002 18:39:57.980504 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" podStartSLOduration=3.980482517 podStartE2EDuration="3.980482517s" podCreationTimestamp="2025-10-02 18:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:39:57.95596557 +0000 UTC m=+1319.143461429" watchObservedRunningTime="2025-10-02 18:39:57.980482517 +0000 UTC m=+1319.167978376" Oct 02 18:39:58 crc kubenswrapper[4909]: I1002 18:39:58.178919 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 18:39:58 crc kubenswrapper[4909]: I1002 18:39:58.450871 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-849fddf4f5-f44kx"] Oct 02 18:39:58 crc kubenswrapper[4909]: W1002 18:39:58.451098 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84148aec_ab89_4621_8448_b1bbbf294ad5.slice/crio-d5dbae20393c154036022cfdb71a261f4425db3c2af26cc309cb5e01bfb84e68 WatchSource:0}: Error finding container d5dbae20393c154036022cfdb71a261f4425db3c2af26cc309cb5e01bfb84e68: Status 404 returned error can't find the container with id d5dbae20393c154036022cfdb71a261f4425db3c2af26cc309cb5e01bfb84e68 Oct 02 18:39:58 crc kubenswrapper[4909]: I1002 18:39:58.904720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-849fddf4f5-f44kx" event={"ID":"84148aec-ab89-4621-8448-b1bbbf294ad5","Type":"ContainerStarted","Data":"d5dbae20393c154036022cfdb71a261f4425db3c2af26cc309cb5e01bfb84e68"} Oct 
02 18:39:59 crc kubenswrapper[4909]: I1002 18:39:59.619057 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ea0645-5a72-4be7-b21b-d6abdfd9b351" path="/var/lib/kubelet/pods/21ea0645-5a72-4be7-b21b-d6abdfd9b351/volumes" Oct 02 18:39:59 crc kubenswrapper[4909]: I1002 18:39:59.925762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-849fddf4f5-f44kx" event={"ID":"84148aec-ab89-4621-8448-b1bbbf294ad5","Type":"ContainerStarted","Data":"83b99730225fc9272031c903f921f0d2718ce17f618cdbfb9e7f6a2bab594d29"} Oct 02 18:40:03 crc kubenswrapper[4909]: I1002 18:40:03.178249 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 02 18:40:03 crc kubenswrapper[4909]: I1002 18:40:03.188363 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 02 18:40:03 crc kubenswrapper[4909]: I1002 18:40:03.979925 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 02 18:40:04 crc kubenswrapper[4909]: I1002 18:40:04.961242 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.049201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"] Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.049417 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="dnsmasq-dns" containerID="cri-o://6cf01c83ac2ad4c8461040d5fb82d2fdf259acd2f9b842d337c2652f0f38d006" gracePeriod=10 Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.997836 4909 generic.go:334] "Generic (PLEG): container finished" podID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" 
containerID="6cf01c83ac2ad4c8461040d5fb82d2fdf259acd2f9b842d337c2652f0f38d006" exitCode=0 Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.997963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" event={"ID":"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b","Type":"ContainerDied","Data":"6cf01c83ac2ad4c8461040d5fb82d2fdf259acd2f9b842d337c2652f0f38d006"} Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.998418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" event={"ID":"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b","Type":"ContainerDied","Data":"e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795"} Oct 02 18:40:05 crc kubenswrapper[4909]: I1002 18:40:05.998433 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0bd53b8f78fb1e078d35c6dfd013113dd7a5bc15d51a761976e28dbe4d42795" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.069535 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.247711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.247798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.247902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llpj4\" (UniqueName: \"kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.248082 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.248145 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.248255 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config\") pod \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\" (UID: \"f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b\") " Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.264363 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4" (OuterVolumeSpecName: "kube-api-access-llpj4") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "kube-api-access-llpj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.317157 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.321283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.333997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.337245 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.350164 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llpj4\" (UniqueName: \"kubernetes.io/projected/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-kube-api-access-llpj4\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.350193 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.350202 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.350211 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.350219 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.351527 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config" (OuterVolumeSpecName: "config") pod "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" (UID: "f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:06 crc kubenswrapper[4909]: I1002 18:40:06.452111 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.011068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-849fddf4f5-f44kx" event={"ID":"84148aec-ab89-4621-8448-b1bbbf294ad5","Type":"ContainerStarted","Data":"c387b0bebfdca1a0d29270fa8ab82c87f821bfbfd5818cec7fc83c7062e05f00"} Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.011098 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.042635 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-849fddf4f5-f44kx" podStartSLOduration=10.042619616 podStartE2EDuration="10.042619616s" podCreationTimestamp="2025-10-02 18:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:07.03441435 +0000 UTC m=+1328.221910209" watchObservedRunningTime="2025-10-02 18:40:07.042619616 +0000 UTC m=+1328.230115475" Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.058776 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"] Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.069884 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-9kwsd"] Oct 02 18:40:07 crc kubenswrapper[4909]: I1002 18:40:07.624473 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" path="/var/lib/kubelet/pods/f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b/volumes" Oct 02 18:40:08 crc kubenswrapper[4909]: I1002 18:40:08.028817 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e106d88-ec64-496f-b8a4-ca8137353399" containerID="de0e11e5ab4efe4d2925fe8200afb644af4140991c8ad1137253fa265bc7b920" exitCode=0 Oct 02 18:40:08 crc kubenswrapper[4909]: I1002 18:40:08.028873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcr26" event={"ID":"4e106d88-ec64-496f-b8a4-ca8137353399","Type":"ContainerDied","Data":"de0e11e5ab4efe4d2925fe8200afb644af4140991c8ad1137253fa265bc7b920"} Oct 02 18:40:08 crc kubenswrapper[4909]: I1002 18:40:08.039991 4909 generic.go:334] "Generic (PLEG): container finished" podID="5bf2d8c4-630f-4e52-86b5-d20381e90564" containerID="3ad6fdc0d4c617c974619894478fecf07fd6e1e0f7dcf6f29cfd95062fcf3313" exitCode=0 Oct 02 18:40:08 crc kubenswrapper[4909]: I1002 18:40:08.040242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zw6nz" event={"ID":"5bf2d8c4-630f-4e52-86b5-d20381e90564","Type":"ContainerDied","Data":"3ad6fdc0d4c617c974619894478fecf07fd6e1e0f7dcf6f29cfd95062fcf3313"} Oct 02 18:40:08 crc kubenswrapper[4909]: I1002 18:40:08.040316 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.052799 4909 generic.go:334] "Generic (PLEG): container finished" podID="e7f47e10-7e95-4d88-ae97-71b020828eaa" containerID="a4886512dd74e4bf92e75a07d365db5432c367595ddfd519be9f05f6fc8878b8" exitCode=0 Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.052879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f8b6m" 
event={"ID":"e7f47e10-7e95-4d88-ae97-71b020828eaa","Type":"ContainerDied","Data":"a4886512dd74e4bf92e75a07d365db5432c367595ddfd519be9f05f6fc8878b8"} Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.503100 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.596342 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zw6nz" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.624453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.624522 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzg8x\" (UniqueName: \"kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.624588 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.624621 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 
18:40:09.624651 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.624723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data\") pod \"4e106d88-ec64-496f-b8a4-ca8137353399\" (UID: \"4e106d88-ec64-496f-b8a4-ca8137353399\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.631230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x" (OuterVolumeSpecName: "kube-api-access-fzg8x") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "kube-api-access-fzg8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.631351 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.639413 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts" (OuterVolumeSpecName: "scripts") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.647629 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.652278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.667984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data" (OuterVolumeSpecName: "config-data") pod "4e106d88-ec64-496f-b8a4-ca8137353399" (UID: "4e106d88-ec64-496f-b8a4-ca8137353399"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.726590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts\") pod \"5bf2d8c4-630f-4e52-86b5-d20381e90564\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.726734 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs\") pod \"5bf2d8c4-630f-4e52-86b5-d20381e90564\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.726795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle\") pod \"5bf2d8c4-630f-4e52-86b5-d20381e90564\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.726910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data\") pod \"5bf2d8c4-630f-4e52-86b5-d20381e90564\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.727085 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj4vv\" (UniqueName: \"kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv\") pod \"5bf2d8c4-630f-4e52-86b5-d20381e90564\" (UID: \"5bf2d8c4-630f-4e52-86b5-d20381e90564\") " Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.727509 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs" (OuterVolumeSpecName: "logs") pod "5bf2d8c4-630f-4e52-86b5-d20381e90564" (UID: "5bf2d8c4-630f-4e52-86b5-d20381e90564"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728160 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728265 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzg8x\" (UniqueName: \"kubernetes.io/projected/4e106d88-ec64-496f-b8a4-ca8137353399-kube-api-access-fzg8x\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728291 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf2d8c4-630f-4e52-86b5-d20381e90564-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728308 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728320 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728331 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.728343 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4e106d88-ec64-496f-b8a4-ca8137353399-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.731462 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv" (OuterVolumeSpecName: "kube-api-access-fj4vv") pod "5bf2d8c4-630f-4e52-86b5-d20381e90564" (UID: "5bf2d8c4-630f-4e52-86b5-d20381e90564"). InnerVolumeSpecName "kube-api-access-fj4vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.731807 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts" (OuterVolumeSpecName: "scripts") pod "5bf2d8c4-630f-4e52-86b5-d20381e90564" (UID: "5bf2d8c4-630f-4e52-86b5-d20381e90564"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.751451 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data" (OuterVolumeSpecName: "config-data") pod "5bf2d8c4-630f-4e52-86b5-d20381e90564" (UID: "5bf2d8c4-630f-4e52-86b5-d20381e90564"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.770298 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bf2d8c4-630f-4e52-86b5-d20381e90564" (UID: "5bf2d8c4-630f-4e52-86b5-d20381e90564"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.829981 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.830006 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj4vv\" (UniqueName: \"kubernetes.io/projected/5bf2d8c4-630f-4e52-86b5-d20381e90564-kube-api-access-fj4vv\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.830016 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:09 crc kubenswrapper[4909]: I1002 18:40:09.830035 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf2d8c4-630f-4e52-86b5-d20381e90564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.071174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcr26" event={"ID":"4e106d88-ec64-496f-b8a4-ca8137353399","Type":"ContainerDied","Data":"14205984170c742f0b8fca70cce529d3797101b1673a1dfec9c08a1aff4ae116"} Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.071202 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rcr26" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.071739 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14205984170c742f0b8fca70cce529d3797101b1673a1dfec9c08a1aff4ae116" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.073466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerStarted","Data":"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"} Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.076267 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zw6nz" event={"ID":"5bf2d8c4-630f-4e52-86b5-d20381e90564","Type":"ContainerDied","Data":"554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493"} Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.076296 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="554c5a5737add0957862a893500ff0f28f4ac2b8cdfaa5a91c1bb5b00df09493" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.076274 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zw6nz" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.078497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7dmx" event={"ID":"ce4584a9-c2b3-4d6e-976e-131b8b349d79","Type":"ContainerStarted","Data":"a1705cbe5e754667c3244841ddd57de8b2706814894832bcff4e971b74b672c7"} Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.089960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r4lmb" event={"ID":"af170421-1a33-4e1d-b1ef-83ac5388791c","Type":"ContainerStarted","Data":"1f18c200e8b9e3da31fd7d087179b8e4d18b339d06a9a90d5d79bb49b765685e"} Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.116924 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-k7dmx" podStartSLOduration=3.000201537 podStartE2EDuration="45.116909987s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="2025-10-02 18:39:26.732681772 +0000 UTC m=+1287.920177631" lastFinishedPulling="2025-10-02 18:40:08.849390222 +0000 UTC m=+1330.036886081" observedRunningTime="2025-10-02 18:40:10.114732109 +0000 UTC m=+1331.302227968" watchObservedRunningTime="2025-10-02 18:40:10.116909987 +0000 UTC m=+1331.304405846" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.271665 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-r4lmb" podStartSLOduration=2.826353995 podStartE2EDuration="42.271644883s" podCreationTimestamp="2025-10-02 18:39:28 +0000 UTC" firstStartedPulling="2025-10-02 18:39:29.659221546 +0000 UTC m=+1290.846717395" lastFinishedPulling="2025-10-02 18:40:09.104512414 +0000 UTC m=+1330.292008283" observedRunningTime="2025-10-02 18:40:10.157065392 +0000 UTC m=+1331.344561251" watchObservedRunningTime="2025-10-02 18:40:10.271644883 +0000 UTC m=+1331.459140742" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.279691 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8486f6d788-jp6q4"] Oct 02 18:40:10 crc kubenswrapper[4909]: E1002 18:40:10.280140 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="dnsmasq-dns" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280155 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="dnsmasq-dns" Oct 02 18:40:10 crc kubenswrapper[4909]: E1002 18:40:10.280207 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="init" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280214 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="init" Oct 02 18:40:10 crc kubenswrapper[4909]: E1002 18:40:10.280232 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e106d88-ec64-496f-b8a4-ca8137353399" containerName="keystone-bootstrap" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280247 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e106d88-ec64-496f-b8a4-ca8137353399" containerName="keystone-bootstrap" Oct 02 18:40:10 crc kubenswrapper[4909]: E1002 18:40:10.280260 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf2d8c4-630f-4e52-86b5-d20381e90564" containerName="placement-db-sync" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280266 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf2d8c4-630f-4e52-86b5-d20381e90564" containerName="placement-db-sync" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280446 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="dnsmasq-dns" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280461 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5bf2d8c4-630f-4e52-86b5-d20381e90564" containerName="placement-db-sync" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.280471 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e106d88-ec64-496f-b8a4-ca8137353399" containerName="keystone-bootstrap" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.281159 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.285680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8486f6d788-jp6q4"] Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.286434 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.286612 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.287252 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.287543 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zsxbj" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.287658 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.287717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.382781 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57bc775dcb-fd69b"] Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.384737 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.389395 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.389555 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ghgpp" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.389648 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.390231 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.390397 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.393699 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57bc775dcb-fd69b"] Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452220 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64hsv\" (UniqueName: \"kubernetes.io/projected/eeeaf88e-e118-47c2-84b3-088443528f41-kube-api-access-64hsv\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452288 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-config-data\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452328 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-scripts\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452389 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-combined-ca-bundle\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452438 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-fernet-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452493 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-internal-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452550 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-public-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.452603 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-credential-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554466 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-combined-ca-bundle\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-internal-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-fernet-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-internal-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554610 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qtdx\" (UniqueName: \"kubernetes.io/projected/af49e65e-4d40-4b78-8219-aa5d209825d0-kube-api-access-4qtdx\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554629 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-combined-ca-bundle\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-public-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554681 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af49e65e-4d40-4b78-8219-aa5d209825d0-logs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554711 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-credential-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554742 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-config-data\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64hsv\" (UniqueName: \"kubernetes.io/projected/eeeaf88e-e118-47c2-84b3-088443528f41-kube-api-access-64hsv\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554802 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-config-data\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554826 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-scripts\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-scripts\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.554859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-public-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.562005 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-credential-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.562125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-fernet-keys\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.562220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-config-data\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.565162 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-scripts\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.565470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-public-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: 
\"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.565485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-combined-ca-bundle\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.565473 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeeaf88e-e118-47c2-84b3-088443528f41-internal-tls-certs\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.576600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64hsv\" (UniqueName: \"kubernetes.io/projected/eeeaf88e-e118-47c2-84b3-088443528f41-kube-api-access-64hsv\") pod \"keystone-8486f6d788-jp6q4\" (UID: \"eeeaf88e-e118-47c2-84b3-088443528f41\") " pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.630657 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.660943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-scripts\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.660995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-public-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.661192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-internal-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.661276 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qtdx\" (UniqueName: \"kubernetes.io/projected/af49e65e-4d40-4b78-8219-aa5d209825d0-kube-api-access-4qtdx\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.661307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-combined-ca-bundle\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" 
Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.661349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af49e65e-4d40-4b78-8219-aa5d209825d0-logs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.661397 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-config-data\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.665755 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af49e65e-4d40-4b78-8219-aa5d209825d0-logs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.667114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-internal-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.667905 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-config-data\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.669250 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-scripts\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.672636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-public-tls-certs\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.675088 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af49e65e-4d40-4b78-8219-aa5d209825d0-combined-ca-bundle\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.704344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qtdx\" (UniqueName: \"kubernetes.io/projected/af49e65e-4d40-4b78-8219-aa5d209825d0-kube-api-access-4qtdx\") pod \"placement-57bc775dcb-fd69b\" (UID: \"af49e65e-4d40-4b78-8219-aa5d209825d0\") " pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.771752 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f8b6m" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.932213 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-9kwsd" podUID="f9b4cb24-4aa2-4311-ac0b-9664ff0ec33b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.967312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kq79\" (UniqueName: \"kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79\") pod \"e7f47e10-7e95-4d88-ae97-71b020828eaa\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.968708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle\") pod \"e7f47e10-7e95-4d88-ae97-71b020828eaa\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.968791 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data\") pod \"e7f47e10-7e95-4d88-ae97-71b020828eaa\" (UID: \"e7f47e10-7e95-4d88-ae97-71b020828eaa\") " Oct 02 18:40:10 crc kubenswrapper[4909]: I1002 18:40:10.986404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79" (OuterVolumeSpecName: "kube-api-access-9kq79") pod "e7f47e10-7e95-4d88-ae97-71b020828eaa" (UID: "e7f47e10-7e95-4d88-ae97-71b020828eaa"). InnerVolumeSpecName "kube-api-access-9kq79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.003688 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.019778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7f47e10-7e95-4d88-ae97-71b020828eaa" (UID: "e7f47e10-7e95-4d88-ae97-71b020828eaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.071597 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kq79\" (UniqueName: \"kubernetes.io/projected/e7f47e10-7e95-4d88-ae97-71b020828eaa-kube-api-access-9kq79\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.071624 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.143225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data" (OuterVolumeSpecName: "config-data") pod "e7f47e10-7e95-4d88-ae97-71b020828eaa" (UID: "e7f47e10-7e95-4d88-ae97-71b020828eaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.173361 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f47e10-7e95-4d88-ae97-71b020828eaa-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.188456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f8b6m" event={"ID":"e7f47e10-7e95-4d88-ae97-71b020828eaa","Type":"ContainerDied","Data":"bd01683c35a5552ccba798a84d504f9321324adc43ada5ab439c290cca8e9851"} Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.188498 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd01683c35a5552ccba798a84d504f9321324adc43ada5ab439c290cca8e9851" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.188519 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f8b6m" Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.249706 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8486f6d788-jp6q4"] Oct 02 18:40:11 crc kubenswrapper[4909]: I1002 18:40:11.667584 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57bc775dcb-fd69b"] Oct 02 18:40:12 crc kubenswrapper[4909]: I1002 18:40:12.203473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57bc775dcb-fd69b" event={"ID":"af49e65e-4d40-4b78-8219-aa5d209825d0","Type":"ContainerStarted","Data":"ebb6113928bd7f6921b89f35c2602a27ffe841e2c64e8e6612c4dbb4055f2f0d"} Oct 02 18:40:12 crc kubenswrapper[4909]: I1002 18:40:12.208727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8486f6d788-jp6q4" event={"ID":"eeeaf88e-e118-47c2-84b3-088443528f41","Type":"ContainerStarted","Data":"0115d6906f7931fa25a7ce4643b863dfd991af354ba34b9e96ca70a1d68ab3e0"} Oct 02 18:40:12 crc 
kubenswrapper[4909]: I1002 18:40:12.208809 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8486f6d788-jp6q4" event={"ID":"eeeaf88e-e118-47c2-84b3-088443528f41","Type":"ContainerStarted","Data":"b64003fb4b53733883a581808320e232d120a3b9746684efb2f3c578a9d74693"} Oct 02 18:40:13 crc kubenswrapper[4909]: I1002 18:40:13.222771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57bc775dcb-fd69b" event={"ID":"af49e65e-4d40-4b78-8219-aa5d209825d0","Type":"ContainerStarted","Data":"32e6f921462ad88b08ce27e9240a85e4dd6c5659198a5b0ba5bfe107c5280233"} Oct 02 18:40:13 crc kubenswrapper[4909]: I1002 18:40:13.223802 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:13 crc kubenswrapper[4909]: I1002 18:40:13.223829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57bc775dcb-fd69b" event={"ID":"af49e65e-4d40-4b78-8219-aa5d209825d0","Type":"ContainerStarted","Data":"d4f9673a6269bba7adee24666a848efdd010edd1e8d5b569717b7b68ddb25822"} Oct 02 18:40:13 crc kubenswrapper[4909]: I1002 18:40:13.260194 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8486f6d788-jp6q4" podStartSLOduration=3.260176474 podStartE2EDuration="3.260176474s" podCreationTimestamp="2025-10-02 18:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:13.252436231 +0000 UTC m=+1334.439932090" watchObservedRunningTime="2025-10-02 18:40:13.260176474 +0000 UTC m=+1334.447672333" Oct 02 18:40:13 crc kubenswrapper[4909]: I1002 18:40:13.291980 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57bc775dcb-fd69b" podStartSLOduration=3.291962557 podStartE2EDuration="3.291962557s" podCreationTimestamp="2025-10-02 18:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:13.283215503 +0000 UTC m=+1334.470711372" watchObservedRunningTime="2025-10-02 18:40:13.291962557 +0000 UTC m=+1334.479458406" Oct 02 18:40:14 crc kubenswrapper[4909]: I1002 18:40:14.231859 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:14 crc kubenswrapper[4909]: I1002 18:40:14.232160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:16 crc kubenswrapper[4909]: I1002 18:40:16.252053 4909 generic.go:334] "Generic (PLEG): container finished" podID="af170421-1a33-4e1d-b1ef-83ac5388791c" containerID="1f18c200e8b9e3da31fd7d087179b8e4d18b339d06a9a90d5d79bb49b765685e" exitCode=0 Oct 02 18:40:16 crc kubenswrapper[4909]: I1002 18:40:16.252144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r4lmb" event={"ID":"af170421-1a33-4e1d-b1ef-83ac5388791c","Type":"ContainerDied","Data":"1f18c200e8b9e3da31fd7d087179b8e4d18b339d06a9a90d5d79bb49b765685e"} Oct 02 18:40:17 crc kubenswrapper[4909]: I1002 18:40:17.879410 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.016212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data\") pod \"af170421-1a33-4e1d-b1ef-83ac5388791c\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.016871 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle\") pod \"af170421-1a33-4e1d-b1ef-83ac5388791c\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.016950 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d5gp\" (UniqueName: \"kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp\") pod \"af170421-1a33-4e1d-b1ef-83ac5388791c\" (UID: \"af170421-1a33-4e1d-b1ef-83ac5388791c\") " Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.020741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af170421-1a33-4e1d-b1ef-83ac5388791c" (UID: "af170421-1a33-4e1d-b1ef-83ac5388791c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.028688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp" (OuterVolumeSpecName: "kube-api-access-4d5gp") pod "af170421-1a33-4e1d-b1ef-83ac5388791c" (UID: "af170421-1a33-4e1d-b1ef-83ac5388791c"). 
InnerVolumeSpecName "kube-api-access-4d5gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.050560 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af170421-1a33-4e1d-b1ef-83ac5388791c" (UID: "af170421-1a33-4e1d-b1ef-83ac5388791c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.119907 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.119958 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af170421-1a33-4e1d-b1ef-83ac5388791c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.119977 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d5gp\" (UniqueName: \"kubernetes.io/projected/af170421-1a33-4e1d-b1ef-83ac5388791c-kube-api-access-4d5gp\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.277801 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" containerID="a1705cbe5e754667c3244841ddd57de8b2706814894832bcff4e971b74b672c7" exitCode=0 Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.277887 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7dmx" event={"ID":"ce4584a9-c2b3-4d6e-976e-131b8b349d79","Type":"ContainerDied","Data":"a1705cbe5e754667c3244841ddd57de8b2706814894832bcff4e971b74b672c7"} Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 
18:40:18.280174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r4lmb" event={"ID":"af170421-1a33-4e1d-b1ef-83ac5388791c","Type":"ContainerDied","Data":"f7e0f0fe46b51391dd10e3a36c7067d3a7c2d886b21af2880fd1d88607d39adb"} Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.280214 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e0f0fe46b51391dd10e3a36c7067d3a7c2d886b21af2880fd1d88607d39adb" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.280286 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-r4lmb" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.282855 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerStarted","Data":"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1"} Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.282975 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-central-agent" containerID="cri-o://7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a" gracePeriod=30 Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.283079 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-notification-agent" containerID="cri-o://a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d" gracePeriod=30 Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.283077 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="sg-core" 
containerID="cri-o://b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544" gracePeriod=30 Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.283090 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="proxy-httpd" containerID="cri-o://6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1" gracePeriod=30 Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.283100 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.353152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557618009 podStartE2EDuration="53.35313405s" podCreationTimestamp="2025-10-02 18:39:25 +0000 UTC" firstStartedPulling="2025-10-02 18:39:27.114290697 +0000 UTC m=+1288.301786556" lastFinishedPulling="2025-10-02 18:40:17.909806738 +0000 UTC m=+1339.097302597" observedRunningTime="2025-10-02 18:40:18.343551141 +0000 UTC m=+1339.531047000" watchObservedRunningTime="2025-10-02 18:40:18.35313405 +0000 UTC m=+1339.540629909" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.569496 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-77d5f8945f-mrwjk"] Oct 02 18:40:18 crc kubenswrapper[4909]: E1002 18:40:18.569908 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" containerName="barbican-db-sync" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.569925 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" containerName="barbican-db-sync" Oct 02 18:40:18 crc kubenswrapper[4909]: E1002 18:40:18.569973 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f47e10-7e95-4d88-ae97-71b020828eaa" containerName="heat-db-sync" Oct 02 18:40:18 
crc kubenswrapper[4909]: I1002 18:40:18.569979 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f47e10-7e95-4d88-ae97-71b020828eaa" containerName="heat-db-sync" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.570175 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" containerName="barbican-db-sync" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.570190 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f47e10-7e95-4d88-ae97-71b020828eaa" containerName="heat-db-sync" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.578620 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.582963 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.583548 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.587365 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vj7rl" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.589302 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77d5f8945f-mrwjk"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.635397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0709099c-cc29-4dbb-9874-ab921318a936-logs\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.635489 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.635547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-combined-ca-bundle\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.635592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data-custom\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.635678 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwb2\" (UniqueName: \"kubernetes.io/projected/0709099c-cc29-4dbb-9874-ab921318a936-kube-api-access-pvwb2\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.686609 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75889c558-g9b2x"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.691567 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.704381 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.707331 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75889c558-g9b2x"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.724362 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.737586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.737841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data-custom\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.737929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efde1707-49d8-4674-9cf8-3cd1de63b5c9-logs\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-combined-ca-bundle\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data-custom\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwb2\" (UniqueName: \"kubernetes.io/projected/0709099c-cc29-4dbb-9874-ab921318a936-kube-api-access-pvwb2\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-combined-ca-bundle\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 
18:40:18.738450 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkq4\" (UniqueName: \"kubernetes.io/projected/efde1707-49d8-4674-9cf8-3cd1de63b5c9-kube-api-access-zqkq4\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.738599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0709099c-cc29-4dbb-9874-ab921318a936-logs\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.739006 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0709099c-cc29-4dbb-9874-ab921318a936-logs\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.745075 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.773950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-config-data-custom\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.777242 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwb2\" (UniqueName: \"kubernetes.io/projected/0709099c-cc29-4dbb-9874-ab921318a936-kube-api-access-pvwb2\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.779325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0709099c-cc29-4dbb-9874-ab921318a936-combined-ca-bundle\") pod \"barbican-worker-77d5f8945f-mrwjk\" (UID: \"0709099c-cc29-4dbb-9874-ab921318a936\") " pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.779706 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.779863 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.841414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data-custom\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.841471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efde1707-49d8-4674-9cf8-3cd1de63b5c9-logs\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.841525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-combined-ca-bundle\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.841563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.841588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkq4\" (UniqueName: \"kubernetes.io/projected/efde1707-49d8-4674-9cf8-3cd1de63b5c9-kube-api-access-zqkq4\") pod 
\"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.842523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efde1707-49d8-4674-9cf8-3cd1de63b5c9-logs\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.868737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkq4\" (UniqueName: \"kubernetes.io/projected/efde1707-49d8-4674-9cf8-3cd1de63b5c9-kube-api-access-zqkq4\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.875372 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-combined-ca-bundle\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.875709 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data-custom\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.888796 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efde1707-49d8-4674-9cf8-3cd1de63b5c9-config-data\") pod \"barbican-keystone-listener-75889c558-g9b2x\" (UID: \"efde1707-49d8-4674-9cf8-3cd1de63b5c9\") " pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.913143 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.918257 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.926638 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.939456 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478bt\" (UniqueName: \"kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943179 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.943237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:18 crc kubenswrapper[4909]: I1002 18:40:18.953312 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-77d5f8945f-mrwjk" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.044808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045181 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxnr\" (UniqueName: \"kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478bt\" (UniqueName: \"kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" 
(UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045265 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045285 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " 
pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.045453 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.046913 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.050327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.050580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.050978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc 
kubenswrapper[4909]: I1002 18:40:19.052816 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.065483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478bt\" (UniqueName: \"kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt\") pod \"dnsmasq-dns-75c8ddd69c-pl6dr\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") " pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.079575 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.147459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.147608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.147711 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: 
\"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.147795 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.147861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxnr\" (UniqueName: \"kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.148558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.152871 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.153429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " 
pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.153934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.170209 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxnr\" (UniqueName: \"kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr\") pod \"barbican-api-56bc8c7d98-gqjkp\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.247533 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.258881 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309045 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerID="6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1" exitCode=0 Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309076 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerID="b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544" exitCode=2 Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309087 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerID="7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a" exitCode=0 Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309270 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerDied","Data":"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1"} Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309300 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerDied","Data":"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"} Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.309310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerDied","Data":"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"} Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.422951 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77d5f8945f-mrwjk"] Oct 02 18:40:19 crc kubenswrapper[4909]: W1002 18:40:19.440188 4909 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0709099c_cc29_4dbb_9874_ab921318a936.slice/crio-09c7c8b25b16fac5b397447112663d6114f42b80cf4986512b1e775369de8bc4 WatchSource:0}: Error finding container 09c7c8b25b16fac5b397447112663d6114f42b80cf4986512b1e775369de8bc4: Status 404 returned error can't find the container with id 09c7c8b25b16fac5b397447112663d6114f42b80cf4986512b1e775369de8bc4 Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.630806 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75889c558-g9b2x"] Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.769154 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k7dmx" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.866708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbfl\" (UniqueName: \"kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.867777 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.867865 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.867889 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.867922 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.868013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data\") pod \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\" (UID: \"ce4584a9-c2b3-4d6e-976e-131b8b349d79\") " Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.870152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.873949 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.887874 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts" (OuterVolumeSpecName: "scripts") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.888411 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl" (OuterVolumeSpecName: "kube-api-access-7hbfl") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "kube-api-access-7hbfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.949755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data" (OuterVolumeSpecName: "config-data") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.954710 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce4584a9-c2b3-4d6e-976e-131b8b349d79" (UID: "ce4584a9-c2b3-4d6e-976e-131b8b349d79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970718 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970759 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4584a9-c2b3-4d6e-976e-131b8b349d79-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970774 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970786 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970798 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbfl\" (UniqueName: \"kubernetes.io/projected/ce4584a9-c2b3-4d6e-976e-131b8b349d79-kube-api-access-7hbfl\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.970809 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4584a9-c2b3-4d6e-976e-131b8b349d79-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.982001 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"] Oct 02 18:40:19 crc kubenswrapper[4909]: I1002 18:40:19.993306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 
18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.339925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" event={"ID":"efde1707-49d8-4674-9cf8-3cd1de63b5c9","Type":"ContainerStarted","Data":"1dd611309d77ad7740b3daca9b97fe41f17957c3c70a2104da9b68f163d79add"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.346159 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerStarted","Data":"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.346223 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerStarted","Data":"2f1e211c7c0388ef9e2f0268db552e23c099ce1742a623377e5002b7e5cec761"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.348916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d5f8945f-mrwjk" event={"ID":"0709099c-cc29-4dbb-9874-ab921318a936","Type":"ContainerStarted","Data":"09c7c8b25b16fac5b397447112663d6114f42b80cf4986512b1e775369de8bc4"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.351660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7dmx" event={"ID":"ce4584a9-c2b3-4d6e-976e-131b8b349d79","Type":"ContainerDied","Data":"369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.351688 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="369596de19d5c0db8a519bda3920f1ca4bec019b7dd5eaeb6ccf13df6417a693" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.351748 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-k7dmx" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.355321 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" event={"ID":"e0402b2a-0af0-4638-9488-c8d40da13cf9","Type":"ContainerStarted","Data":"ea7c6634708fb2ccdaa117907f289d8d685f9fe18574f46eadf2aec77cb9e1b3"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.355380 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" event={"ID":"e0402b2a-0af0-4638-9488-c8d40da13cf9","Type":"ContainerStarted","Data":"c259e21b3e4351acacf95ab3a27e9fa4d80d272f3f8952dd1935bd4e0cc1ad78"} Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.611279 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"] Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.670434 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:40:20 crc kubenswrapper[4909]: E1002 18:40:20.671122 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" containerName="cinder-db-sync" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.671135 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" containerName="cinder-db-sync" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.671296 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" containerName="cinder-db-sync" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.672334 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.683803 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689604 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8z8\" (UniqueName: \"kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689785 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689918 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.689963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793311 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8z8\" (UniqueName: \"kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793493 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793521 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.793603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.794775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.795967 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.796518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.796749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.797172 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.826699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8z8\" (UniqueName: \"kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8\") pod \"dnsmasq-dns-5784cf869f-pqdnn\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.997293 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:20 crc kubenswrapper[4909]: I1002 18:40:20.999111 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmbp\" (UniqueName: \"kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 
18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.001292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.033645 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.034242 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.093285 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.093618 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.093755 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v9vvh" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.093892 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.137843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.149566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.149833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.149898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.149923 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.149997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmbp\" (UniqueName: \"kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.150650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc 
kubenswrapper[4909]: I1002 18:40:21.168191 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.177906 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.186241 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.186673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.192137 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmbp\" (UniqueName: \"kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.214175 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.229396 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.229499 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.231449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data\") pod \"cinder-scheduler-0\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253363 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253395 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9sk\" (UniqueName: \"kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.253576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.271346 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357217 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357306 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9sk\" (UniqueName: 
\"kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.357478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.361447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.361678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.361712 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.362313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.365942 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.372420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.376857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerStarted","Data":"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011"} Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.377924 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.377949 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.378703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9sk\" (UniqueName: \"kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk\") pod \"cinder-api-0\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.379786 4909 generic.go:334] "Generic (PLEG): container finished" podID="e0402b2a-0af0-4638-9488-c8d40da13cf9" containerID="ea7c6634708fb2ccdaa117907f289d8d685f9fe18574f46eadf2aec77cb9e1b3" exitCode=0 Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.379811 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" event={"ID":"e0402b2a-0af0-4638-9488-c8d40da13cf9","Type":"ContainerDied","Data":"ea7c6634708fb2ccdaa117907f289d8d685f9fe18574f46eadf2aec77cb9e1b3"} Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.438856 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56bc8c7d98-gqjkp" podStartSLOduration=3.4388394780000002 podStartE2EDuration="3.438839478s" podCreationTimestamp="2025-10-02 18:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:21.405234618 +0000 UTC m=+1342.592730477" watchObservedRunningTime="2025-10-02 18:40:21.438839478 +0000 UTC m=+1342.626335337" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.607498 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:21 crc kubenswrapper[4909]: E1002 18:40:21.786727 4909 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 02 18:40:21 crc kubenswrapper[4909]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e0402b2a-0af0-4638-9488-c8d40da13cf9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 18:40:21 crc kubenswrapper[4909]: > podSandboxID="c259e21b3e4351acacf95ab3a27e9fa4d80d272f3f8952dd1935bd4e0cc1ad78" Oct 02 18:40:21 crc kubenswrapper[4909]: E1002 18:40:21.786912 4909 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 02 18:40:21 crc kubenswrapper[4909]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- 
--no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h86hd4h5f9hc8h599h5h56bh75h554h597h5f4hb7h98h58fh66ch57ch668h5bfhd8h596h68dh54h8ch674h587h5bdhb9hc4h695h5b8hccq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-478bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75c8ddd69c-pl6dr_openstack(e0402b2a-0af0-4638-9488-c8d40da13cf9): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e0402b2a-0af0-4638-9488-c8d40da13cf9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 18:40:21 crc kubenswrapper[4909]: > logger="UnhandledError" Oct 02 18:40:21 crc kubenswrapper[4909]: E1002 18:40:21.788132 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e0402b2a-0af0-4638-9488-c8d40da13cf9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" podUID="e0402b2a-0af0-4638-9488-c8d40da13cf9" Oct 02 18:40:21 crc kubenswrapper[4909]: I1002 18:40:21.809178 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.264142 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:22 crc 
kubenswrapper[4909]: W1002 18:40:22.271011 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398fdf24_c388_4ddf_8489_99f9e9279e08.slice/crio-dbfa4cabb1ecca0684a429d2823d860c7cafc61e89125d408f768ada7728ec38 WatchSource:0}: Error finding container dbfa4cabb1ecca0684a429d2823d860c7cafc61e89125d408f768ada7728ec38: Status 404 returned error can't find the container with id dbfa4cabb1ecca0684a429d2823d860c7cafc61e89125d408f768ada7728ec38 Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.392545 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.412969 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerID="a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d" exitCode=0 Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.413037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerDied","Data":"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"} Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.413068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a5af68b-015f-4b28-a85f-a71b2b000e6a","Type":"ContainerDied","Data":"c7b09797e2f09efe71758f23d6deba689f9668f9c05a3489c308ff3f9e65303a"} Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.413085 4909 scope.go:117] "RemoveContainer" containerID="6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.413207 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.415469 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerStarted","Data":"dbfa4cabb1ecca0684a429d2823d860c7cafc61e89125d408f768ada7728ec38"} Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.423425 4909 generic.go:334] "Generic (PLEG): container finished" podID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerID="7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924" exitCode=0 Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.424192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" event={"ID":"4941e170-ef64-42b6-b9a1-56deb33e252d","Type":"ContainerDied","Data":"7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924"} Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.424247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" event={"ID":"4941e170-ef64-42b6-b9a1-56deb33e252d","Type":"ContainerStarted","Data":"3959718f01c1c86915613f546a8782643ee41fe405506816c775bd18c64c3038"} Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.473682 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498147 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28xm5\" (UniqueName: \"kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498276 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498631 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498731 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.498787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: 
\"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.499122 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.499182 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts\") pod \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\" (UID: \"4a5af68b-015f-4b28-a85f-a71b2b000e6a\") " Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.501740 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.501853 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a5af68b-015f-4b28-a85f-a71b2b000e6a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.505257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts" (OuterVolumeSpecName: "scripts") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.505280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5" (OuterVolumeSpecName: "kube-api-access-28xm5") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "kube-api-access-28xm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.526686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.589193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.603388 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.603420 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28xm5\" (UniqueName: \"kubernetes.io/projected/4a5af68b-015f-4b28-a85f-a71b2b000e6a-kube-api-access-28xm5\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.603431 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.603441 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.608618 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data" (OuterVolumeSpecName: "config-data") pod "4a5af68b-015f-4b28-a85f-a71b2b000e6a" (UID: "4a5af68b-015f-4b28-a85f-a71b2b000e6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.705304 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5af68b-015f-4b28-a85f-a71b2b000e6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.760140 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.765286 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.795492 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:22 crc kubenswrapper[4909]: E1002 18:40:22.796084 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-central-agent" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796234 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-central-agent" Oct 02 18:40:22 crc kubenswrapper[4909]: E1002 18:40:22.796299 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="sg-core" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796367 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="sg-core" Oct 02 18:40:22 crc kubenswrapper[4909]: E1002 18:40:22.796423 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-notification-agent" Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796480 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-notification-agent" Oct 02 18:40:22 crc 
kubenswrapper[4909]: E1002 18:40:22.796549 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="proxy-httpd"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796606 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="proxy-httpd"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796841 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-notification-agent"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796904 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="ceilometer-central-agent"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.796970 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="proxy-httpd"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.797033 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" containerName="sg-core"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.798792 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.805439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.807019 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.831109 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.909445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910362 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwf2\" (UniqueName: \"kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:22 crc kubenswrapper[4909]: I1002 18:40:22.910876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.012556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.013985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014472 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwf2\" (UniqueName: \"kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014624 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014672 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014761 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.014803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.017845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.019358 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.020532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.027742 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.046692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwf2\" (UniqueName: \"kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2\") pod \"ceilometer-0\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.058192 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.058254 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.179189 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.619991 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5af68b-015f-4b28-a85f-a71b2b000e6a" path="/var/lib/kubelet/pods/4a5af68b-015f-4b28-a85f-a71b2b000e6a/volumes"
Oct 02 18:40:23 crc kubenswrapper[4909]: W1002 18:40:23.734678 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28601611_0f12_49b5_8d3f_c74c414edcd7.slice/crio-6d1693f5b631bccab79403bed3c461822de6f43323d24403193c03ce307db2c4 WatchSource:0}: Error finding container 6d1693f5b631bccab79403bed3c461822de6f43323d24403193c03ce307db2c4: Status 404 returned error can't find the container with id 6d1693f5b631bccab79403bed3c461822de6f43323d24403193c03ce307db2c4
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.779874 4909 scope.go:117] "RemoveContainer" containerID="b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.804371 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.830925 4909 scope.go:117] "RemoveContainer" containerID="a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.932246 4909 scope.go:117] "RemoveContainer" containerID="7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.938749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.938916 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.939016 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.939109 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.939144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.939183 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-478bt\" (UniqueName: \"kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt\") pod \"e0402b2a-0af0-4638-9488-c8d40da13cf9\" (UID: \"e0402b2a-0af0-4638-9488-c8d40da13cf9\") "
Oct 02 18:40:23 crc kubenswrapper[4909]: I1002 18:40:23.956950 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt" (OuterVolumeSpecName: "kube-api-access-478bt") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "kube-api-access-478bt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.024583 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.028957 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.036289 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.040499 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config" (OuterVolumeSpecName: "config") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.041503 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.041525 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.041534 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-config\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.041542 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.041551 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-478bt\" (UniqueName: \"kubernetes.io/projected/e0402b2a-0af0-4638-9488-c8d40da13cf9-kube-api-access-478bt\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.047707 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0402b2a-0af0-4638-9488-c8d40da13cf9" (UID: "e0402b2a-0af0-4638-9488-c8d40da13cf9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.144413 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0402b2a-0af0-4638-9488-c8d40da13cf9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.234791 4909 scope.go:117] "RemoveContainer" containerID="6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1"
Oct 02 18:40:24 crc kubenswrapper[4909]: E1002 18:40:24.237686 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1\": container with ID starting with 6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1 not found: ID does not exist" containerID="6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.237735 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1"} err="failed to get container status \"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1\": rpc error: code = NotFound desc = could not find container \"6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1\": container with ID starting with 6238c62c6162938ca838128e38f7d524f327b85396cd59e5146d5a50b6896fd1 not found: ID does not exist"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.237780 4909 scope.go:117] "RemoveContainer" containerID="b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"
Oct 02 18:40:24 crc kubenswrapper[4909]: E1002 18:40:24.238281 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544\": container with ID starting with b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544 not found: ID does not exist" containerID="b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.238304 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544"} err="failed to get container status \"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544\": rpc error: code = NotFound desc = could not find container \"b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544\": container with ID starting with b26e40a8c083c5a30d74924e755880d962b8a2197acd1960062ab9fe2c8f7544 not found: ID does not exist"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.238318 4909 scope.go:117] "RemoveContainer" containerID="a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"
Oct 02 18:40:24 crc kubenswrapper[4909]: E1002 18:40:24.238553 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d\": container with ID starting with a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d not found: ID does not exist" containerID="a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.238575 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d"} err="failed to get container status \"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d\": rpc error: code = NotFound desc = could not find container \"a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d\": container with ID starting with a483f4561b62b41d34b53bf6ec82dac0afe0c194a57cb399623c7d92b1c2eb1d not found: ID does not exist"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.238604 4909 scope.go:117] "RemoveContainer" containerID="7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"
Oct 02 18:40:24 crc kubenswrapper[4909]: E1002 18:40:24.239165 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a\": container with ID starting with 7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a not found: ID does not exist" containerID="7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.239206 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a"} err="failed to get container status \"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a\": rpc error: code = NotFound desc = could not find container \"7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a\": container with ID starting with 7161639f45b1cfc94b079a44ebd5ab955d31a77d410f4bae95187f02d096330a not found: ID does not exist"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.371549 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:40:24 crc kubenswrapper[4909]: W1002 18:40:24.397843 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d6526c2_426e_4126_8692_7fc6066a4b4f.slice/crio-604b195a320e33551e15582c6e3c40ab05f64758a4b3c89fd0d2148e31a1ea57 WatchSource:0}: Error finding container 604b195a320e33551e15582c6e3c40ab05f64758a4b3c89fd0d2148e31a1ea57: Status 404 returned error can't find the container with id 604b195a320e33551e15582c6e3c40ab05f64758a4b3c89fd0d2148e31a1ea57
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.484290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerStarted","Data":"6d1693f5b631bccab79403bed3c461822de6f43323d24403193c03ce307db2c4"}
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.488813 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerStarted","Data":"604b195a320e33551e15582c6e3c40ab05f64758a4b3c89fd0d2148e31a1ea57"}
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.494625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d5f8945f-mrwjk" event={"ID":"0709099c-cc29-4dbb-9874-ab921318a936","Type":"ContainerStarted","Data":"089b175f79563febfa5bc72fe9646cf4ac6b418cde8c9f349804b6c57c2f69a4"}
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.496562 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" event={"ID":"4941e170-ef64-42b6-b9a1-56deb33e252d","Type":"ContainerStarted","Data":"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03"}
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.497356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.502249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr" event={"ID":"e0402b2a-0af0-4638-9488-c8d40da13cf9","Type":"ContainerDied","Data":"c259e21b3e4351acacf95ab3a27e9fa4d80d272f3f8952dd1935bd4e0cc1ad78"}
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.502301 4909 scope.go:117] "RemoveContainer" containerID="ea7c6634708fb2ccdaa117907f289d8d685f9fe18574f46eadf2aec77cb9e1b3"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.502421 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-pl6dr"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.525780 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" podStartSLOduration=4.525764864 podStartE2EDuration="4.525764864s" podCreationTimestamp="2025-10-02 18:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:24.51862069 +0000 UTC m=+1345.706116549" watchObservedRunningTime="2025-10-02 18:40:24.525764864 +0000 UTC m=+1345.713260723"
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.611468 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"]
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.618505 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-pl6dr"]
Oct 02 18:40:24 crc kubenswrapper[4909]: I1002 18:40:24.992093 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fdf9896db-xbknn"
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.290144 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.559275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerStarted","Data":"527dc8844e54e0f8f0ed6ed08a9f0b4f208b35b31b919f7a62527bcdad085a51"}
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.571437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerStarted","Data":"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4"}
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.573185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d5f8945f-mrwjk" event={"ID":"0709099c-cc29-4dbb-9874-ab921318a936","Type":"ContainerStarted","Data":"e8b11e785ba69e5439bd480c7fff936f865cf0ba9fda210989207b42aaa4f7a8"}
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.597224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" event={"ID":"efde1707-49d8-4674-9cf8-3cd1de63b5c9","Type":"ContainerStarted","Data":"a55d4538fe51a16361f817f5892e8df3ce4336f4391df0e68efa405b54e2cb01"}
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.597272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" event={"ID":"efde1707-49d8-4674-9cf8-3cd1de63b5c9","Type":"ContainerStarted","Data":"8f1032ae17e48a7bfc97dd542f516f09527e5cd2d0a7a6bb877a47759597b683"}
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.656263 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-77d5f8945f-mrwjk" podStartSLOduration=3.259317348 podStartE2EDuration="7.656243787s" podCreationTimestamp="2025-10-02 18:40:18 +0000 UTC" firstStartedPulling="2025-10-02 18:40:19.45468845 +0000 UTC m=+1340.642184309" lastFinishedPulling="2025-10-02 18:40:23.851614879 +0000 UTC m=+1345.039110748" observedRunningTime="2025-10-02 18:40:25.610599361 +0000 UTC m=+1346.798095220" watchObservedRunningTime="2025-10-02 18:40:25.656243787 +0000 UTC m=+1346.843739646"
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.683325 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75889c558-g9b2x" podStartSLOduration=3.45072527 podStartE2EDuration="7.683306143s" podCreationTimestamp="2025-10-02 18:40:18 +0000 UTC" firstStartedPulling="2025-10-02 18:40:19.634831669 +0000 UTC m=+1340.822327528" lastFinishedPulling="2025-10-02 18:40:23.867412532 +0000 UTC m=+1345.054908401" observedRunningTime="2025-10-02 18:40:25.664593328 +0000 UTC m=+1346.852089187" watchObservedRunningTime="2025-10-02 18:40:25.683306143 +0000 UTC m=+1346.870802002"
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.752826 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0402b2a-0af0-4638-9488-c8d40da13cf9" path="/var/lib/kubelet/pods/e0402b2a-0af0-4638-9488-c8d40da13cf9/volumes"
Oct 02 18:40:25 crc kubenswrapper[4909]: I1002 18:40:25.753702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerStarted","Data":"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f"}
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.201568 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b6849dc4b-wmr8p"]
Oct 02 18:40:26 crc kubenswrapper[4909]: E1002 18:40:26.202489 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0402b2a-0af0-4638-9488-c8d40da13cf9" containerName="init"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.202501 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0402b2a-0af0-4638-9488-c8d40da13cf9" containerName="init"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.202702 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0402b2a-0af0-4638-9488-c8d40da13cf9" containerName="init"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.203755 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.211701 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.215980 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.216362 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b6849dc4b-wmr8p"]
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224738 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-combined-ca-bundle\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-internal-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcl4k\" (UniqueName: \"kubernetes.io/projected/b4918d43-b740-469f-9ee9-3011c688b622-kube-api-access-kcl4k\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224901 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-public-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224926 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data-custom\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.224955 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4918d43-b740-469f-9ee9-3011c688b622-logs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4918d43-b740-469f-9ee9-3011c688b622-logs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-combined-ca-bundle\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-internal-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338818 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcl4k\" (UniqueName: \"kubernetes.io/projected/b4918d43-b740-469f-9ee9-3011c688b622-kube-api-access-kcl4k\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-public-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.338902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data-custom\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.339708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4918d43-b740-469f-9ee9-3011c688b622-logs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.348418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data-custom\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.381094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-internal-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.381557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-combined-ca-bundle\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.381814 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-config-data\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.384060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4918d43-b740-469f-9ee9-3011c688b622-public-tls-certs\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.393526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcl4k\" (UniqueName: \"kubernetes.io/projected/b4918d43-b740-469f-9ee9-3011c688b622-kube-api-access-kcl4k\") pod \"barbican-api-6b6849dc4b-wmr8p\" (UID: \"b4918d43-b740-469f-9ee9-3011c688b622\") " pod="openstack/barbican-api-6b6849dc4b-wmr8p"
Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.644511 4909 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-6b6849dc4b-wmr8p" Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.724156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerStarted","Data":"ba2efee2ddc6b6c822e36a21c5d73a9636576b741d693e88c2b4db519ef01597"} Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.741180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerStarted","Data":"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28"} Oct 02 18:40:26 crc kubenswrapper[4909]: I1002 18:40:26.782687 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.82132539 podStartE2EDuration="6.782668864s" podCreationTimestamp="2025-10-02 18:40:20 +0000 UTC" firstStartedPulling="2025-10-02 18:40:22.276364898 +0000 UTC m=+1343.463860757" lastFinishedPulling="2025-10-02 18:40:24.237708372 +0000 UTC m=+1345.425204231" observedRunningTime="2025-10-02 18:40:26.76398927 +0000 UTC m=+1347.951485129" watchObservedRunningTime="2025-10-02 18:40:26.782668864 +0000 UTC m=+1347.970164723" Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.250678 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b6849dc4b-wmr8p"] Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.753571 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerStarted","Data":"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6"} Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.754006 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api-log" 
containerID="cri-o://24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" gracePeriod=30 Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.754258 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.754488 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api" containerID="cri-o://0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" gracePeriod=30 Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.788899 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.788879984 podStartE2EDuration="7.788879984s" podCreationTimestamp="2025-10-02 18:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:27.788587915 +0000 UTC m=+1348.976083774" watchObservedRunningTime="2025-10-02 18:40:27.788879984 +0000 UTC m=+1348.976375843" Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.796375 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerStarted","Data":"5d7a42c9551b8c804d2c17805805d7149da2bdc50ded0239161f6061ce008aba"} Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.813422 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6849dc4b-wmr8p" event={"ID":"b4918d43-b740-469f-9ee9-3011c688b622","Type":"ContainerStarted","Data":"02e1eb6b004b6f7c4426e5c8676c42010f64f3691093933f6793871804fb6ca7"} Oct 02 18:40:27 crc kubenswrapper[4909]: I1002 18:40:27.813538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6849dc4b-wmr8p" 
event={"ID":"b4918d43-b740-469f-9ee9-3011c688b622","Type":"ContainerStarted","Data":"3d7de4609692a6fd62875dd2d489064806a8d3501583c775b450f3dce5325a66"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.107772 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-849fddf4f5-f44kx" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.180200 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.180740 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fdf9896db-xbknn" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-api" containerID="cri-o://5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de" gracePeriod=30 Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.181291 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fdf9896db-xbknn" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-httpd" containerID="cri-o://1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd" gracePeriod=30 Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.765787 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.845874 4909 generic.go:334] "Generic (PLEG): container finished" podID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerID="0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" exitCode=0 Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.845906 4909 generic.go:334] "Generic (PLEG): container finished" podID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerID="24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" exitCode=143 Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.845953 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.846245 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerDied","Data":"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.846302 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerDied","Data":"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.846313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28601611-0f12-49b5-8d3f-c74c414edcd7","Type":"ContainerDied","Data":"6d1693f5b631bccab79403bed3c461822de6f43323d24403193c03ce307db2c4"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.846331 4909 scope.go:117] "RemoveContainer" containerID="0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.850396 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6849dc4b-wmr8p" 
event={"ID":"b4918d43-b740-469f-9ee9-3011c688b622","Type":"ContainerStarted","Data":"b3e9e0f5b10d757522dbd68a29917ba1c6085c9e94f4fb43a03ed9b97e53d4cc"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.850586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b6849dc4b-wmr8p" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.858203 4909 generic.go:334] "Generic (PLEG): container finished" podID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerID="1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd" exitCode=0 Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.858244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerDied","Data":"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd"} Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.882553 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b6849dc4b-wmr8p" podStartSLOduration=2.8825346979999997 podStartE2EDuration="2.882534698s" podCreationTimestamp="2025-10-02 18:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:28.874780765 +0000 UTC m=+1350.062276624" watchObservedRunningTime="2025-10-02 18:40:28.882534698 +0000 UTC m=+1350.070030557" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904319 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904374 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904397 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904571 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904627 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.904680 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9sk\" (UniqueName: \"kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk\") pod \"28601611-0f12-49b5-8d3f-c74c414edcd7\" (UID: \"28601611-0f12-49b5-8d3f-c74c414edcd7\") " Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 
18:40:28.906087 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.906269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs" (OuterVolumeSpecName: "logs") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.913204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk" (OuterVolumeSpecName: "kube-api-access-vv9sk") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "kube-api-access-vv9sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.920202 4909 scope.go:117] "RemoveContainer" containerID="24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.952192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts" (OuterVolumeSpecName: "scripts") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.952286 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:28 crc kubenswrapper[4909]: I1002 18:40:28.992219 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.010854 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.010983 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv9sk\" (UniqueName: \"kubernetes.io/projected/28601611-0f12-49b5-8d3f-c74c414edcd7-kube-api-access-vv9sk\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.011077 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28601611-0f12-49b5-8d3f-c74c414edcd7-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.011136 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data-custom\") on node 
\"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.011208 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28601611-0f12-49b5-8d3f-c74c414edcd7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.011266 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.092965 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data" (OuterVolumeSpecName: "config-data") pod "28601611-0f12-49b5-8d3f-c74c414edcd7" (UID: "28601611-0f12-49b5-8d3f-c74c414edcd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.107938 4909 scope.go:117] "RemoveContainer" containerID="0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" Oct 02 18:40:29 crc kubenswrapper[4909]: E1002 18:40:29.110042 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6\": container with ID starting with 0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6 not found: ID does not exist" containerID="0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110090 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6"} err="failed to get container status \"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6\": rpc error: code = NotFound desc = could not 
find container \"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6\": container with ID starting with 0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6 not found: ID does not exist" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110117 4909 scope.go:117] "RemoveContainer" containerID="24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" Oct 02 18:40:29 crc kubenswrapper[4909]: E1002 18:40:29.110556 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f\": container with ID starting with 24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f not found: ID does not exist" containerID="24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110591 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f"} err="failed to get container status \"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f\": rpc error: code = NotFound desc = could not find container \"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f\": container with ID starting with 24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f not found: ID does not exist" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110614 4909 scope.go:117] "RemoveContainer" containerID="0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110844 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6"} err="failed to get container status \"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6\": rpc error: code = NotFound desc = 
could not find container \"0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6\": container with ID starting with 0d4cd8296d21f118a2456d61bf09669b7c20cc2ea28ecf4346a2d365d8ac29e6 not found: ID does not exist" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.110873 4909 scope.go:117] "RemoveContainer" containerID="24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.111098 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f"} err="failed to get container status \"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f\": rpc error: code = NotFound desc = could not find container \"24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f\": container with ID starting with 24d2b2953ba9642814b48438fd077e95c2754723a3e2547cf5d4575f78fae79f not found: ID does not exist" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.113414 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28601611-0f12-49b5-8d3f-c74c414edcd7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.176316 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.183943 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.205784 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:29 crc kubenswrapper[4909]: E1002 18:40:29.206890 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api-log" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.206975 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api-log" Oct 02 18:40:29 crc kubenswrapper[4909]: E1002 18:40:29.207070 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.207144 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.207466 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.207567 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" containerName="cinder-api-log" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.211266 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.213527 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.214667 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.216940 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.225840 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.316971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data\") pod \"cinder-api-0\" (UID: 
\"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317011 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/237e5a0a-a767-48de-aa6b-bb8cd24fd570-logs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237e5a0a-a767-48de-aa6b-bb8cd24fd570-etc-machine-id\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317197 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-public-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 
18:40:29.317304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data-custom\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317336 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6js\" (UniqueName: \"kubernetes.io/projected/237e5a0a-a767-48de-aa6b-bb8cd24fd570-kube-api-access-8x6js\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.317384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-scripts\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data-custom\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6js\" (UniqueName: \"kubernetes.io/projected/237e5a0a-a767-48de-aa6b-bb8cd24fd570-kube-api-access-8x6js\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419391 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-scripts\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419455 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419472 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/237e5a0a-a767-48de-aa6b-bb8cd24fd570-logs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237e5a0a-a767-48de-aa6b-bb8cd24fd570-etc-machine-id\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419608 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-public-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.419748 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237e5a0a-a767-48de-aa6b-bb8cd24fd570-etc-machine-id\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.420114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/237e5a0a-a767-48de-aa6b-bb8cd24fd570-logs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.425346 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.427446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-scripts\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.428451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" 
Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.428623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.431372 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-config-data-custom\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.432686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/237e5a0a-a767-48de-aa6b-bb8cd24fd570-public-tls-certs\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.442622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6js\" (UniqueName: \"kubernetes.io/projected/237e5a0a-a767-48de-aa6b-bb8cd24fd570-kube-api-access-8x6js\") pod \"cinder-api-0\" (UID: \"237e5a0a-a767-48de-aa6b-bb8cd24fd570\") " pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.530775 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.622885 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28601611-0f12-49b5-8d3f-c74c414edcd7" path="/var/lib/kubelet/pods/28601611-0f12-49b5-8d3f-c74c414edcd7/volumes" Oct 02 18:40:29 crc kubenswrapper[4909]: I1002 18:40:29.875229 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b6849dc4b-wmr8p" Oct 02 18:40:30 crc kubenswrapper[4909]: W1002 18:40:30.605290 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237e5a0a_a767_48de_aa6b_bb8cd24fd570.slice/crio-8c5c21bd62769807b24e06d5d6e1c83c4f529491e34ed25245b2675ceccfd9af WatchSource:0}: Error finding container 8c5c21bd62769807b24e06d5d6e1c83c4f529491e34ed25245b2675ceccfd9af: Status 404 returned error can't find the container with id 8c5c21bd62769807b24e06d5d6e1c83c4f529491e34ed25245b2675ceccfd9af Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.606503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.902262 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerStarted","Data":"aacfdd3a860934cb6abe4638540916d8e9fd8aacfa066e7bdd033cde19f97a37"} Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.903758 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.907089 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"237e5a0a-a767-48de-aa6b-bb8cd24fd570","Type":"ContainerStarted","Data":"8c5c21bd62769807b24e06d5d6e1c83c4f529491e34ed25245b2675ceccfd9af"} Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.926547 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.8841886 podStartE2EDuration="8.926528265s" podCreationTimestamp="2025-10-02 18:40:22 +0000 UTC" firstStartedPulling="2025-10-02 18:40:24.409885803 +0000 UTC m=+1345.597381662" lastFinishedPulling="2025-10-02 18:40:29.452225468 +0000 UTC m=+1350.639721327" observedRunningTime="2025-10-02 18:40:30.924074979 +0000 UTC m=+1352.111570848" watchObservedRunningTime="2025-10-02 18:40:30.926528265 +0000 UTC m=+1352.114024124" Oct 02 18:40:30 crc kubenswrapper[4909]: I1002 18:40:30.994954 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.036852 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.111986 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.112221 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="dnsmasq-dns" containerID="cri-o://178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2" gracePeriod=10 Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.154672 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.286516 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.828894 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.838123 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.936618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"237e5a0a-a767-48de-aa6b-bb8cd24fd570","Type":"ContainerStarted","Data":"26f3fcee8f14e8dc3766a04210a3e99200e320b3bbd3c7a18e1f0280f2e18941"} Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.943629 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerID="178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2" exitCode=0 Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.943843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" event={"ID":"cf0443bc-15f9-47c0-9105-a1acb3ff6998","Type":"ContainerDied","Data":"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2"} Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.943916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" event={"ID":"cf0443bc-15f9-47c0-9105-a1acb3ff6998","Type":"ContainerDied","Data":"fa9182a132ad12da56516fcf8b1394d324535b8fbc23ed690bd320a6c3bb205e"} Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.943938 4909 scope.go:117] "RemoveContainer" containerID="178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.944350 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6nhsg" Oct 02 18:40:31 crc kubenswrapper[4909]: I1002 18:40:31.991690 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.002682 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.002826 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.002897 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd5gj\" (UniqueName: \"kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.002946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.002992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc 
kubenswrapper[4909]: I1002 18:40:32.003053 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0\") pod \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\" (UID: \"cf0443bc-15f9-47c0-9105-a1acb3ff6998\") " Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.007271 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj" (OuterVolumeSpecName: "kube-api-access-wd5gj") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "kube-api-access-wd5gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.039251 4909 scope.go:117] "RemoveContainer" containerID="7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.080125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.101836 4909 scope.go:117] "RemoveContainer" containerID="178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2" Oct 02 18:40:32 crc kubenswrapper[4909]: E1002 18:40:32.102323 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2\": container with ID starting with 178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2 not found: ID does not exist" containerID="178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.102370 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2"} err="failed to get container status \"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2\": rpc error: code = NotFound desc = could not find container \"178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2\": container with ID starting with 178810c99dda33a54608b7954abcda4498242114d22e307f3ca418ab6f0406a2 not found: ID does not exist" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.102390 4909 scope.go:117] "RemoveContainer" containerID="7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d" Oct 02 18:40:32 crc kubenswrapper[4909]: E1002 18:40:32.102876 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d\": container with ID starting with 7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d not found: ID does not exist" containerID="7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.102915 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d"} err="failed to get container status \"7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d\": rpc error: code = NotFound desc = could not find container \"7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d\": container with ID starting with 7154a06e37ba27bfa100e1f26df8932f45f44ed8a184a666dbec0236c918672d not found: ID does not exist" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.105708 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd5gj\" (UniqueName: \"kubernetes.io/projected/cf0443bc-15f9-47c0-9105-a1acb3ff6998-kube-api-access-wd5gj\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.105726 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.116020 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config" (OuterVolumeSpecName: "config") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.116579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.119521 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.131084 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf0443bc-15f9-47c0-9105-a1acb3ff6998" (UID: "cf0443bc-15f9-47c0-9105-a1acb3ff6998"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.209357 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.209394 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.209405 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.209413 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf0443bc-15f9-47c0-9105-a1acb3ff6998-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:32 crc 
kubenswrapper[4909]: I1002 18:40:32.321303 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.334455 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6nhsg"] Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.954991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"237e5a0a-a767-48de-aa6b-bb8cd24fd570","Type":"ContainerStarted","Data":"2937fc922302c8de385c312b2d47440d06499054d59550782f10d9ae24c47e7e"} Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.955424 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.956595 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="cinder-scheduler" containerID="cri-o://e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4" gracePeriod=30 Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.956630 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="probe" containerID="cri-o://e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28" gracePeriod=30 Oct 02 18:40:32 crc kubenswrapper[4909]: I1002 18:40:32.980772 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.980756693 podStartE2EDuration="3.980756693s" podCreationTimestamp="2025-10-02 18:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:32.973435433 +0000 UTC m=+1354.160931292" watchObservedRunningTime="2025-10-02 18:40:32.980756693 +0000 UTC 
m=+1354.168252552" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.640109 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" path="/var/lib/kubelet/pods/cf0443bc-15f9-47c0-9105-a1acb3ff6998/volumes" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.766634 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.941489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle\") pod \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.941605 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfrhx\" (UniqueName: \"kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx\") pod \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.941662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs\") pod \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.941875 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config\") pod \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.941933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config\") pod \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\" (UID: \"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6\") " Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.949244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" (UID: "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.950104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx" (OuterVolumeSpecName: "kube-api-access-mfrhx") pod "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" (UID: "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6"). InnerVolumeSpecName "kube-api-access-mfrhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.971040 4909 generic.go:334] "Generic (PLEG): container finished" podID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerID="5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de" exitCode=0 Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.971104 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fdf9896db-xbknn" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.971196 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerDied","Data":"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de"} Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.971313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fdf9896db-xbknn" event={"ID":"8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6","Type":"ContainerDied","Data":"517ead285c1fccc390441a6407713289e89f099fc5c336bae3eb329c6faa0e7c"} Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.971390 4909 scope.go:117] "RemoveContainer" containerID="1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd" Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.975467 4909 generic.go:334] "Generic (PLEG): container finished" podID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerID="e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28" exitCode=0 Oct 02 18:40:33 crc kubenswrapper[4909]: I1002 18:40:33.976552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerDied","Data":"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28"} Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.016686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config" (OuterVolumeSpecName: "config") pod "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" (UID: "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.020236 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" (UID: "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.041837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" (UID: "8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.044216 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.044251 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.044265 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.044278 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfrhx\" (UniqueName: \"kubernetes.io/projected/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-kube-api-access-mfrhx\") on node \"crc\" DevicePath 
\"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.044291 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.091005 4909 scope.go:117] "RemoveContainer" containerID="5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.111378 4909 scope.go:117] "RemoveContainer" containerID="1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd" Oct 02 18:40:34 crc kubenswrapper[4909]: E1002 18:40:34.111768 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd\": container with ID starting with 1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd not found: ID does not exist" containerID="1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.111847 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd"} err="failed to get container status \"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd\": rpc error: code = NotFound desc = could not find container \"1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd\": container with ID starting with 1d3927b3827eacfa330879aee50a447eb20c80ebc7289e73058d3a60e184c2fd not found: ID does not exist" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.111883 4909 scope.go:117] "RemoveContainer" containerID="5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de" Oct 02 18:40:34 crc kubenswrapper[4909]: E1002 18:40:34.112567 4909 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de\": container with ID starting with 5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de not found: ID does not exist" containerID="5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.112590 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de"} err="failed to get container status \"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de\": rpc error: code = NotFound desc = could not find container \"5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de\": container with ID starting with 5e0cc45e6814d606e7cfb80ce1112c559441741385367485cfc4d640654a52de not found: ID does not exist" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.350190 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.384805 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fdf9896db-xbknn"] Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.837794 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.966979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967084 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967133 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967191 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmbp\" (UniqueName: \"kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967309 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.967490 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom\") pod \"398fdf24-c388-4ddf-8489-99f9e9279e08\" (UID: \"398fdf24-c388-4ddf-8489-99f9e9279e08\") " Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.968686 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/398fdf24-c388-4ddf-8489-99f9e9279e08-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.971555 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp" (OuterVolumeSpecName: "kube-api-access-wqmbp") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "kube-api-access-wqmbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.972592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts" (OuterVolumeSpecName: "scripts") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:34 crc kubenswrapper[4909]: I1002 18:40:34.973414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.029526 4909 generic.go:334] "Generic (PLEG): container finished" podID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerID="e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4" exitCode=0 Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.029606 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerDied","Data":"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4"} Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.029639 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"398fdf24-c388-4ddf-8489-99f9e9279e08","Type":"ContainerDied","Data":"dbfa4cabb1ecca0684a429d2823d860c7cafc61e89125d408f768ada7728ec38"} Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.029638 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.029675 4909 scope.go:117] "RemoveContainer" containerID="e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.033774 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.070125 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.070349 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.070416 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmbp\" (UniqueName: \"kubernetes.io/projected/398fdf24-c388-4ddf-8489-99f9e9279e08-kube-api-access-wqmbp\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.070479 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.111059 4909 scope.go:117] "RemoveContainer" containerID="e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 
18:40:35.114229 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data" (OuterVolumeSpecName: "config-data") pod "398fdf24-c388-4ddf-8489-99f9e9279e08" (UID: "398fdf24-c388-4ddf-8489-99f9e9279e08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.136556 4909 scope.go:117] "RemoveContainer" containerID="e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.137044 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28\": container with ID starting with e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28 not found: ID does not exist" containerID="e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.137075 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28"} err="failed to get container status \"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28\": rpc error: code = NotFound desc = could not find container \"e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28\": container with ID starting with e9caf7f5e175cce36d6b51690d2752c198d138d4a061c8f97f16cb00f5484f28 not found: ID does not exist" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.137096 4909 scope.go:117] "RemoveContainer" containerID="e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.137317 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4\": container with ID starting with e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4 not found: ID does not exist" containerID="e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.137339 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4"} err="failed to get container status \"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4\": rpc error: code = NotFound desc = could not find container \"e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4\": container with ID starting with e7c35a7afbad5ebdf095af1e7459195008c52adc61eab87c647fdd7a9b4aa5e4 not found: ID does not exist" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.172566 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398fdf24-c388-4ddf-8489-99f9e9279e08-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.395564 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.418058 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.429434 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.430092 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="init" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430122 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="init" Oct 02 18:40:35 crc 
kubenswrapper[4909]: E1002 18:40:35.430166 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="dnsmasq-dns" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430179 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="dnsmasq-dns" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.430193 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="probe" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430207 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="probe" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.430244 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-api" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430257 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-api" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.430283 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="cinder-scheduler" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430294 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="cinder-scheduler" Oct 02 18:40:35 crc kubenswrapper[4909]: E1002 18:40:35.430322 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-httpd" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430412 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-httpd" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430763 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-api" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430792 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="cinder-scheduler" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430832 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" containerName="probe" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430857 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0443bc-15f9-47c0-9105-a1acb3ff6998" containerName="dnsmasq-dns" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.430880 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" containerName="neutron-httpd" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.443084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.443241 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.448086 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.580768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.581086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.581282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d04288ee-974d-451f-8144-e981667f5115-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.581478 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbwg\" (UniqueName: \"kubernetes.io/projected/d04288ee-974d-451f-8144-e981667f5115-kube-api-access-dlbwg\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.581623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-scripts\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.581743 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.619983 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398fdf24-c388-4ddf-8489-99f9e9279e08" path="/var/lib/kubelet/pods/398fdf24-c388-4ddf-8489-99f9e9279e08/volumes" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.620874 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6" path="/var/lib/kubelet/pods/8dec45f3-dd04-4d6b-8d1b-eceebc6ba0c6/volumes" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-scripts\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683277 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683427 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d04288ee-974d-451f-8144-e981667f5115-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.683598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbwg\" (UniqueName: \"kubernetes.io/projected/d04288ee-974d-451f-8144-e981667f5115-kube-api-access-dlbwg\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.684364 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d04288ee-974d-451f-8144-e981667f5115-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.689577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " 
pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.690284 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.691828 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-scripts\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.705406 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04288ee-974d-451f-8144-e981667f5115-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.708713 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbwg\" (UniqueName: \"kubernetes.io/projected/d04288ee-974d-451f-8144-e981667f5115-kube-api-access-dlbwg\") pod \"cinder-scheduler-0\" (UID: \"d04288ee-974d-451f-8144-e981667f5115\") " pod="openstack/cinder-scheduler-0" Oct 02 18:40:35 crc kubenswrapper[4909]: I1002 18:40:35.761219 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 18:40:36 crc kubenswrapper[4909]: I1002 18:40:36.234801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 18:40:36 crc kubenswrapper[4909]: W1002 18:40:36.244247 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd04288ee_974d_451f_8144_e981667f5115.slice/crio-0afce93b90d08c197a5ffe169dc76b2277d0d5b902ec7d505f171a1b89fa59d9 WatchSource:0}: Error finding container 0afce93b90d08c197a5ffe169dc76b2277d0d5b902ec7d505f171a1b89fa59d9: Status 404 returned error can't find the container with id 0afce93b90d08c197a5ffe169dc76b2277d0d5b902ec7d505f171a1b89fa59d9 Oct 02 18:40:37 crc kubenswrapper[4909]: I1002 18:40:37.062370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d04288ee-974d-451f-8144-e981667f5115","Type":"ContainerStarted","Data":"8cd6510869515baa8f4a30e51a6d029a0d9997a5aba202205cec2eecdbf2615c"} Oct 02 18:40:37 crc kubenswrapper[4909]: I1002 18:40:37.062574 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d04288ee-974d-451f-8144-e981667f5115","Type":"ContainerStarted","Data":"0afce93b90d08c197a5ffe169dc76b2277d0d5b902ec7d505f171a1b89fa59d9"} Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.067840 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b6849dc4b-wmr8p" Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.076572 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d04288ee-974d-451f-8144-e981667f5115","Type":"ContainerStarted","Data":"92f34739dbadada383270443df3b278e7c0b9d6a6e5996fb66eb300a1787ad3a"} Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.119712 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.119692749 podStartE2EDuration="3.119692749s" podCreationTimestamp="2025-10-02 18:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:38.117084747 +0000 UTC m=+1359.304580596" watchObservedRunningTime="2025-10-02 18:40:38.119692749 +0000 UTC m=+1359.307188608" Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.166567 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b6849dc4b-wmr8p" Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.263851 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.264379 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56bc8c7d98-gqjkp" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api" containerID="cri-o://53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011" gracePeriod=30 Oct 02 18:40:38 crc kubenswrapper[4909]: I1002 18:40:38.264232 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56bc8c7d98-gqjkp" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api-log" containerID="cri-o://aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed" gracePeriod=30 Oct 02 18:40:39 crc kubenswrapper[4909]: I1002 18:40:39.087189 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1694f79-c022-44df-96d1-8eb2200685b4" containerID="aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed" exitCode=143 Oct 02 18:40:39 crc kubenswrapper[4909]: I1002 18:40:39.087386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" 
event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerDied","Data":"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed"} Oct 02 18:40:40 crc kubenswrapper[4909]: I1002 18:40:40.762232 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 18:40:41 crc kubenswrapper[4909]: I1002 18:40:41.395250 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 18:40:41 crc kubenswrapper[4909]: I1002 18:40:41.483379 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56bc8c7d98-gqjkp" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:36030->10.217.0.186:9311: read: connection reset by peer" Oct 02 18:40:41 crc kubenswrapper[4909]: I1002 18:40:41.485870 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56bc8c7d98-gqjkp" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:36042->10.217.0.186:9311: read: connection reset by peer" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.037877 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.118870 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1694f79-c022-44df-96d1-8eb2200685b4" containerID="53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011" exitCode=0 Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.118914 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerDied","Data":"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011"} Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.118942 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bc8c7d98-gqjkp" event={"ID":"e1694f79-c022-44df-96d1-8eb2200685b4","Type":"ContainerDied","Data":"2f1e211c7c0388ef9e2f0268db552e23c099ce1742a623377e5002b7e5cec761"} Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.118958 4909 scope.go:117] "RemoveContainer" containerID="53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.119088 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56bc8c7d98-gqjkp" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.125691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle\") pod \"e1694f79-c022-44df-96d1-8eb2200685b4\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.125840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxnr\" (UniqueName: \"kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr\") pod \"e1694f79-c022-44df-96d1-8eb2200685b4\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.125907 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom\") pod \"e1694f79-c022-44df-96d1-8eb2200685b4\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.125958 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs\") pod \"e1694f79-c022-44df-96d1-8eb2200685b4\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.126008 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data\") pod \"e1694f79-c022-44df-96d1-8eb2200685b4\" (UID: \"e1694f79-c022-44df-96d1-8eb2200685b4\") " Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.127788 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs" (OuterVolumeSpecName: "logs") pod "e1694f79-c022-44df-96d1-8eb2200685b4" (UID: "e1694f79-c022-44df-96d1-8eb2200685b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.132215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1694f79-c022-44df-96d1-8eb2200685b4" (UID: "e1694f79-c022-44df-96d1-8eb2200685b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.140160 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr" (OuterVolumeSpecName: "kube-api-access-5hxnr") pod "e1694f79-c022-44df-96d1-8eb2200685b4" (UID: "e1694f79-c022-44df-96d1-8eb2200685b4"). InnerVolumeSpecName "kube-api-access-5hxnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.173484 4909 scope.go:117] "RemoveContainer" containerID="aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.175536 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1694f79-c022-44df-96d1-8eb2200685b4" (UID: "e1694f79-c022-44df-96d1-8eb2200685b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.177717 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data" (OuterVolumeSpecName: "config-data") pod "e1694f79-c022-44df-96d1-8eb2200685b4" (UID: "e1694f79-c022-44df-96d1-8eb2200685b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.228619 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.228657 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.228670 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxnr\" (UniqueName: \"kubernetes.io/projected/e1694f79-c022-44df-96d1-8eb2200685b4-kube-api-access-5hxnr\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.228678 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1694f79-c022-44df-96d1-8eb2200685b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.228686 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1694f79-c022-44df-96d1-8eb2200685b4-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.290622 4909 scope.go:117] "RemoveContainer" containerID="53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011" Oct 
02 18:40:42 crc kubenswrapper[4909]: E1002 18:40:42.291484 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011\": container with ID starting with 53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011 not found: ID does not exist" containerID="53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.291517 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011"} err="failed to get container status \"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011\": rpc error: code = NotFound desc = could not find container \"53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011\": container with ID starting with 53f210bdc6d0ac9f1396cbedbdd8e09f96c05f39ab862d8fa488a4ccc8468011 not found: ID does not exist" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.291537 4909 scope.go:117] "RemoveContainer" containerID="aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed" Oct 02 18:40:42 crc kubenswrapper[4909]: E1002 18:40:42.291889 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed\": container with ID starting with aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed not found: ID does not exist" containerID="aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.291931 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed"} err="failed to get container status 
\"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed\": rpc error: code = NotFound desc = could not find container \"aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed\": container with ID starting with aee502820ed3993b5479d9529d1460f21a9de59e2715c2b61eab1ebf7f5d7bed not found: ID does not exist" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.444828 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8486f6d788-jp6q4" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.461252 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.469602 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56bc8c7d98-gqjkp"] Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.728335 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:42 crc kubenswrapper[4909]: I1002 18:40:42.746596 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57bc775dcb-fd69b" Oct 02 18:40:43 crc kubenswrapper[4909]: I1002 18:40:43.625579 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" path="/var/lib/kubelet/pods/e1694f79-c022-44df-96d1-8eb2200685b4/volumes" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.022595 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.853723 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.854220 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" 
containerName="ceilometer-central-agent" containerID="cri-o://527dc8844e54e0f8f0ed6ed08a9f0b4f208b35b31b919f7a62527bcdad085a51" gracePeriod=30 Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.854893 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="proxy-httpd" containerID="cri-o://aacfdd3a860934cb6abe4638540916d8e9fd8aacfa066e7bdd033cde19f97a37" gracePeriod=30 Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.854945 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="sg-core" containerID="cri-o://5d7a42c9551b8c804d2c17805805d7149da2bdc50ded0239161f6061ce008aba" gracePeriod=30 Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.854975 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-notification-agent" containerID="cri-o://ba2efee2ddc6b6c822e36a21c5d73a9636576b741d693e88c2b4db519ef01597" gracePeriod=30 Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.960230 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.190:3000/\": read tcp 10.217.0.2:38588->10.217.0.190:3000: read: connection reset by peer" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.998870 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5dc88f55df-4dlwb"] Oct 02 18:40:46 crc kubenswrapper[4909]: E1002 18:40:46.999394 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api-log" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.999419 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api-log" Oct 02 18:40:46 crc kubenswrapper[4909]: E1002 18:40:46.999431 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.999439 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.999704 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api" Oct 02 18:40:46 crc kubenswrapper[4909]: I1002 18:40:46.999745 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1694f79-c022-44df-96d1-8eb2200685b4" containerName="barbican-api-log" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.001246 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.010836 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.010879 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.011567 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.019709 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5dc88f55df-4dlwb"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.167522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-run-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.167599 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dshs\" (UniqueName: \"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-kube-api-access-6dshs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.167806 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-public-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc 
kubenswrapper[4909]: I1002 18:40:47.167895 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-config-data\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.167957 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-etc-swift\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.168076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-combined-ca-bundle\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.168166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-internal-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.168355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-log-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " 
pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.187807 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerID="aacfdd3a860934cb6abe4638540916d8e9fd8aacfa066e7bdd033cde19f97a37" exitCode=0 Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.187844 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerID="5d7a42c9551b8c804d2c17805805d7149da2bdc50ded0239161f6061ce008aba" exitCode=2 Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.187869 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerDied","Data":"aacfdd3a860934cb6abe4638540916d8e9fd8aacfa066e7bdd033cde19f97a37"} Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.187898 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerDied","Data":"5d7a42c9551b8c804d2c17805805d7149da2bdc50ded0239161f6061ce008aba"} Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271291 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-log-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271390 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-run-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271424 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dshs\" (UniqueName: \"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-kube-api-access-6dshs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-public-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-config-data\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-etc-swift\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-combined-ca-bundle\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271572 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-internal-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.271801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-log-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.272629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/811ebdce-cef0-4178-836b-17bcdd164575-run-httpd\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.277950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-public-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.278691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-combined-ca-bundle\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.280194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-etc-swift\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.280707 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-internal-tls-certs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.289916 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811ebdce-cef0-4178-836b-17bcdd164575-config-data\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.292268 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dshs\" (UniqueName: \"kubernetes.io/projected/811ebdce-cef0-4178-836b-17bcdd164575-kube-api-access-6dshs\") pod \"swift-proxy-5dc88f55df-4dlwb\" (UID: \"811ebdce-cef0-4178-836b-17bcdd164575\") " pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.420841 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.479070 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.480337 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.482169 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z5bqn" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.482604 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.482827 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.514339 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.577767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qllth\" (UniqueName: \"kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.578385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.578445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.578484 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.680159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.680241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.680296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.680375 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qllth\" (UniqueName: \"kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.684183 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.687138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.687574 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.704273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qllth\" (UniqueName: \"kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth\") pod \"openstackclient\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.719895 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.720816 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.768205 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.796447 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.799371 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.821634 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:47 crc kubenswrapper[4909]: E1002 18:40:47.840268 4909 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 02 18:40:47 crc kubenswrapper[4909]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_92c039e9-5063-4c73-b775-b4a26f5cee35_0(0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8" Netns:"/var/run/netns/0f02f417-2058-450a-baa7-ac99df6751fa" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8;K8S_POD_UID=92c039e9-5063-4c73-b775-b4a26f5cee35" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/92c039e9-5063-4c73-b775-b4a26f5cee35]: expected pod UID "92c039e9-5063-4c73-b775-b4a26f5cee35" but got "a1aff9fe-ad1a-4056-a64c-8b83abf09d32" from Kube API Oct 02 18:40:47 crc kubenswrapper[4909]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 02 18:40:47 crc kubenswrapper[4909]: > Oct 02 18:40:47 crc kubenswrapper[4909]: E1002 18:40:47.840359 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 02 18:40:47 crc kubenswrapper[4909]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_92c039e9-5063-4c73-b775-b4a26f5cee35_0(0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8" Netns:"/var/run/netns/0f02f417-2058-450a-baa7-ac99df6751fa" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=0c24904ccb7b5127858a3d57d4df77c9c8d8ea460824b752120e17938c88f2f8;K8S_POD_UID=92c039e9-5063-4c73-b775-b4a26f5cee35" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/92c039e9-5063-4c73-b775-b4a26f5cee35]: expected pod UID "92c039e9-5063-4c73-b775-b4a26f5cee35" but got "a1aff9fe-ad1a-4056-a64c-8b83abf09d32" from Kube API Oct 02 18:40:47 crc kubenswrapper[4909]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 02 18:40:47 crc kubenswrapper[4909]: > pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.893200 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.893575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrztm\" (UniqueName: \"kubernetes.io/projected/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-kube-api-access-hrztm\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.893682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.893783 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " 
pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.995574 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrztm\" (UniqueName: \"kubernetes.io/projected/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-kube-api-access-hrztm\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.996332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.997767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.998235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:47 crc kubenswrapper[4909]: I1002 18:40:47.997695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.005390 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.016523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.022673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrztm\" (UniqueName: \"kubernetes.io/projected/a1aff9fe-ad1a-4056-a64c-8b83abf09d32-kube-api-access-hrztm\") pod \"openstackclient\" (UID: \"a1aff9fe-ad1a-4056-a64c-8b83abf09d32\") " pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.056480 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5dc88f55df-4dlwb"] Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.133831 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.203760 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerID="527dc8844e54e0f8f0ed6ed08a9f0b4f208b35b31b919f7a62527bcdad085a51" exitCode=0 Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.203833 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerDied","Data":"527dc8844e54e0f8f0ed6ed08a9f0b4f208b35b31b919f7a62527bcdad085a51"} Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.205915 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.205917 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc88f55df-4dlwb" event={"ID":"811ebdce-cef0-4178-836b-17bcdd164575","Type":"ContainerStarted","Data":"55496b194daedb0166669c6fc089857c4d95546f3126e74599378f8c9f386197"} Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.227467 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="92c039e9-5063-4c73-b775-b4a26f5cee35" podUID="a1aff9fe-ad1a-4056-a64c-8b83abf09d32" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.228038 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.406733 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle\") pod \"92c039e9-5063-4c73-b775-b4a26f5cee35\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.406879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret\") pod \"92c039e9-5063-4c73-b775-b4a26f5cee35\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.406906 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config\") pod \"92c039e9-5063-4c73-b775-b4a26f5cee35\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.406972 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qllth\" (UniqueName: \"kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth\") pod \"92c039e9-5063-4c73-b775-b4a26f5cee35\" (UID: \"92c039e9-5063-4c73-b775-b4a26f5cee35\") " Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.407464 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "92c039e9-5063-4c73-b775-b4a26f5cee35" (UID: "92c039e9-5063-4c73-b775-b4a26f5cee35"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.410933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth" (OuterVolumeSpecName: "kube-api-access-qllth") pod "92c039e9-5063-4c73-b775-b4a26f5cee35" (UID: "92c039e9-5063-4c73-b775-b4a26f5cee35"). InnerVolumeSpecName "kube-api-access-qllth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.411180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "92c039e9-5063-4c73-b775-b4a26f5cee35" (UID: "92c039e9-5063-4c73-b775-b4a26f5cee35"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.414283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92c039e9-5063-4c73-b775-b4a26f5cee35" (UID: "92c039e9-5063-4c73-b775-b4a26f5cee35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.509079 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.509116 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92c039e9-5063-4c73-b775-b4a26f5cee35-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.509126 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qllth\" (UniqueName: \"kubernetes.io/projected/92c039e9-5063-4c73-b775-b4a26f5cee35-kube-api-access-qllth\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.509134 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c039e9-5063-4c73-b775-b4a26f5cee35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:48 crc kubenswrapper[4909]: I1002 18:40:48.673655 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 18:40:48 crc kubenswrapper[4909]: W1002 18:40:48.683448 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1aff9fe_ad1a_4056_a64c_8b83abf09d32.slice/crio-5ea49fc75a62419e6a5be7c4f775ca8b639b31eed00dab54adbc53c11dab8f1e WatchSource:0}: Error finding container 5ea49fc75a62419e6a5be7c4f775ca8b639b31eed00dab54adbc53c11dab8f1e: Status 404 returned error can't find the container with id 5ea49fc75a62419e6a5be7c4f775ca8b639b31eed00dab54adbc53c11dab8f1e Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.215531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a1aff9fe-ad1a-4056-a64c-8b83abf09d32","Type":"ContainerStarted","Data":"5ea49fc75a62419e6a5be7c4f775ca8b639b31eed00dab54adbc53c11dab8f1e"} Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.217994 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.218008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc88f55df-4dlwb" event={"ID":"811ebdce-cef0-4178-836b-17bcdd164575","Type":"ContainerStarted","Data":"0becc811b0fbaa27ce456a553e392802858425662211c5eb59b0e4adb20251b2"} Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.218069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc88f55df-4dlwb" event={"ID":"811ebdce-cef0-4178-836b-17bcdd164575","Type":"ContainerStarted","Data":"00ee459db63617fc6874270c6ac72e7747809f21619c31c4705ac317107f821c"} Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.218354 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.218565 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5dc88f55df-4dlwb" Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.238984 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/swift-proxy-5dc88f55df-4dlwb" podStartSLOduration=3.23896783 podStartE2EDuration="3.23896783s" podCreationTimestamp="2025-10-02 18:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:40:49.234502559 +0000 UTC m=+1370.421998418" watchObservedRunningTime="2025-10-02 18:40:49.23896783 +0000 UTC m=+1370.426463689" Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.245355 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="92c039e9-5063-4c73-b775-b4a26f5cee35" podUID="a1aff9fe-ad1a-4056-a64c-8b83abf09d32" Oct 02 18:40:49 crc kubenswrapper[4909]: I1002 18:40:49.625163 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c039e9-5063-4c73-b775-b4a26f5cee35" path="/var/lib/kubelet/pods/92c039e9-5063-4c73-b775-b4a26f5cee35/volumes" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.247059 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerID="ba2efee2ddc6b6c822e36a21c5d73a9636576b741d693e88c2b4db519ef01597" exitCode=0 Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.247193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerDied","Data":"ba2efee2ddc6b6c822e36a21c5d73a9636576b741d693e88c2b4db519ef01597"} Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.440401 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471179 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471229 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwf2\" (UniqueName: \"kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471287 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471367 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471385 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471412 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.471444 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml\") pod \"3d6526c2-426e-4126-8692-7fc6066a4b4f\" (UID: \"3d6526c2-426e-4126-8692-7fc6066a4b4f\") " Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.472318 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.472741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.485288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2" (OuterVolumeSpecName: "kube-api-access-ndwf2") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "kube-api-access-ndwf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.486303 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts" (OuterVolumeSpecName: "scripts") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.528170 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.572634 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573421 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573447 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573457 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573467 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6526c2-426e-4126-8692-7fc6066a4b4f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573477 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwf2\" (UniqueName: \"kubernetes.io/projected/3d6526c2-426e-4126-8692-7fc6066a4b4f-kube-api-access-ndwf2\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.573485 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.619264 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data" (OuterVolumeSpecName: "config-data") pod "3d6526c2-426e-4126-8692-7fc6066a4b4f" (UID: "3d6526c2-426e-4126-8692-7fc6066a4b4f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:40:51 crc kubenswrapper[4909]: I1002 18:40:51.675792 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6526c2-426e-4126-8692-7fc6066a4b4f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.265867 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6526c2-426e-4126-8692-7fc6066a4b4f","Type":"ContainerDied","Data":"604b195a320e33551e15582c6e3c40ab05f64758a4b3c89fd0d2148e31a1ea57"} Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.266254 4909 scope.go:117] "RemoveContainer" containerID="aacfdd3a860934cb6abe4638540916d8e9fd8aacfa066e7bdd033cde19f97a37" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.266127 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.305081 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.313502 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.321815 4909 scope.go:117] "RemoveContainer" containerID="5d7a42c9551b8c804d2c17805805d7149da2bdc50ded0239161f6061ce008aba" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.328632 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:52 crc kubenswrapper[4909]: E1002 18:40:52.329055 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="proxy-httpd" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329071 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" 
containerName="proxy-httpd" Oct 02 18:40:52 crc kubenswrapper[4909]: E1002 18:40:52.329106 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-central-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329113 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-central-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: E1002 18:40:52.329126 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="sg-core" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329132 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="sg-core" Oct 02 18:40:52 crc kubenswrapper[4909]: E1002 18:40:52.329149 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-notification-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329156 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-notification-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329337 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-central-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329348 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="sg-core" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329362 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="ceilometer-notification-agent" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.329374 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" containerName="proxy-httpd" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.331068 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.333894 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.334125 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.365443 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.401538 4909 scope.go:117] "RemoveContainer" containerID="ba2efee2ddc6b6c822e36a21c5d73a9636576b741d693e88c2b4db519ef01597" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.435345 4909 scope.go:117] "RemoveContainer" containerID="527dc8844e54e0f8f0ed6ed08a9f0b4f208b35b31b919f7a62527bcdad085a51" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.501710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502070 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502495 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.502633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qz5\" (UniqueName: \"kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qz5\" (UniqueName: \"kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5\") pod \"ceilometer-0\" (UID: 
\"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604542 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604629 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604738 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.604772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.606064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.606960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.611808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.612200 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.612297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0" Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.612488 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0"
Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.620478 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qz5\" (UniqueName: \"kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5\") pod \"ceilometer-0\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " pod="openstack/ceilometer-0"
Oct 02 18:40:52 crc kubenswrapper[4909]: I1002 18:40:52.714760 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 18:40:53 crc kubenswrapper[4909]: I1002 18:40:53.054258 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 18:40:53 crc kubenswrapper[4909]: I1002 18:40:53.054563 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 18:40:53 crc kubenswrapper[4909]: I1002 18:40:53.627427 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6526c2-426e-4126-8692-7fc6066a4b4f" path="/var/lib/kubelet/pods/3d6526c2-426e-4126-8692-7fc6066a4b4f/volumes"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.068901 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.070862 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.078618 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2f54j"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.078909 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.078926 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.079415 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.095098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.095590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hdh\" (UniqueName: \"kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.095751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.114117 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.167965 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.171132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.197374 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199138 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77hdh\" (UniqueName: \"kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199250 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942gv\" (UniqueName: \"kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199342 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199439 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199456 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.199486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.212268 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.223205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.225781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.231089 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.232458 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.232632 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77hdh\" (UniqueName: \"kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh\") pod \"heat-engine-78c47d4bdf-9rk8w\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.235242 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.244612 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.260078 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9f896796b-dt79t"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.261454 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.264678 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.280631 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9f896796b-dt79t"]
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9rhw\" (UniqueName: \"kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942gv\" (UniqueName: \"kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whg68\" (UniqueName: \"kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302561 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.302631 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.304991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.305362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.305584 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.308654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.308998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.320666 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942gv\" (UniqueName: \"kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv\") pod \"dnsmasq-dns-f6bc4c6c9-d75nf\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whg68\" (UniqueName: \"kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9rhw\" (UniqueName: \"kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.404365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.421543 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.442271 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.452130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.452709 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.457409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.458904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whg68\" (UniqueName: \"kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.459525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom\") pod \"heat-api-9f896796b-dt79t\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") " pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.467725 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.485849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9rhw\" (UniqueName: \"kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw\") pod \"heat-cfnapi-58db674fb5-5s8s5\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") " pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.515661 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.683516 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:40:55 crc kubenswrapper[4909]: I1002 18:40:55.713537 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:40:56 crc kubenswrapper[4909]: I1002 18:40:56.022428 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:40:57 crc kubenswrapper[4909]: I1002 18:40:57.431266 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5dc88f55df-4dlwb"
Oct 02 18:40:57 crc kubenswrapper[4909]: I1002 18:40:57.431543 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5dc88f55df-4dlwb"
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.295760 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"]
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.336788 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"]
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.371572 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" event={"ID":"20c64b6e-d451-4962-9793-532f6c31f79d","Type":"ContainerStarted","Data":"fd60f3311043b3a323e0fef1299ddd71126dd82431c79d1c5ce1a978122eb073"}
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.372743 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" event={"ID":"a4ec2d23-77a0-413f-a539-39098058c284","Type":"ContainerStarted","Data":"1e3ed82e93d7adee4f815c3121ec6105b886bfe7a44928177d0c52c3e3b2dc5a"}
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.512572 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"]
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.531322 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9f896796b-dt79t"]
Oct 02 18:41:00 crc kubenswrapper[4909]: I1002 18:41:00.554955 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:41:00 crc kubenswrapper[4909]: W1002 18:41:00.585083 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfeecec6_7723_4b63_9152_c9183e3f877d.slice/crio-0324154a87c85a21308c54781fa3d8d2a713dfac6b59730ff922e7fb867d4177 WatchSource:0}: Error finding container 0324154a87c85a21308c54781fa3d8d2a713dfac6b59730ff922e7fb867d4177: Status 404 returned error can't find the container with id 0324154a87c85a21308c54781fa3d8d2a713dfac6b59730ff922e7fb867d4177
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.415487 4909 generic.go:334] "Generic (PLEG): container finished" podID="a4ec2d23-77a0-413f-a539-39098058c284" containerID="a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974" exitCode=0
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.416092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" event={"ID":"a4ec2d23-77a0-413f-a539-39098058c284","Type":"ContainerDied","Data":"a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.421598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerStarted","Data":"0324154a87c85a21308c54781fa3d8d2a713dfac6b59730ff922e7fb867d4177"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.424175 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78c47d4bdf-9rk8w" event={"ID":"d77d6d3d-274d-46a3-b884-9acbd81526b2","Type":"ContainerStarted","Data":"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.424219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78c47d4bdf-9rk8w" event={"ID":"d77d6d3d-274d-46a3-b884-9acbd81526b2","Type":"ContainerStarted","Data":"c4fcb341ba35e6870e5064210058d8907997988361beb0ea5e5be7cdf95a8771"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.424983 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-78c47d4bdf-9rk8w"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.428567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a1aff9fe-ad1a-4056-a64c-8b83abf09d32","Type":"ContainerStarted","Data":"8fd4d7d0e1c211d7161da6f9b64e63546c2bc09f11b493a8a911148adab54d31"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.430769 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f896796b-dt79t" event={"ID":"25396a7c-449c-4891-9f05-80acf9ef5309","Type":"ContainerStarted","Data":"628dd04662407a06e58c7208dc642af79fb1378b47c9aa185ebc8773c7b73065"}
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.481871 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-78c47d4bdf-9rk8w" podStartSLOduration=6.481847359 podStartE2EDuration="6.481847359s" podCreationTimestamp="2025-10-02 18:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:01.458743051 +0000 UTC m=+1382.646238910" watchObservedRunningTime="2025-10-02 18:41:01.481847359 +0000 UTC m=+1382.669343218"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.493152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.9304522139999998 podStartE2EDuration="14.493130814s" podCreationTimestamp="2025-10-02 18:40:47 +0000 UTC" firstStartedPulling="2025-10-02 18:40:48.686489674 +0000 UTC m=+1369.873985533" lastFinishedPulling="2025-10-02 18:41:00.249168274 +0000 UTC m=+1381.436664133" observedRunningTime="2025-10-02 18:41:01.476978285 +0000 UTC m=+1382.664474144" watchObservedRunningTime="2025-10-02 18:41:01.493130814 +0000 UTC m=+1382.680626673"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.661256 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.663254 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.671556 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.678648 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b47684999-l2rd2"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.689287 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.691074 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8578895c89-hpjqt"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.702088 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.712293 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.725103 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"]
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.771140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.771186 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p97v\" (UniqueName: \"kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.771235 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.771323 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873466 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873596 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873703 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873724 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p97v\" (UniqueName: \"kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.873777 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j95qz\" (UniqueName: \"kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.874129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw"
Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.874171 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.874569 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.874595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58grw\" (UniqueName: \"kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.886806 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.887173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.887410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.890007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p97v\" (UniqueName: \"kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v\") pod \"heat-cfnapi-6cc9877f78-zbcsw\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j95qz\" (UniqueName: \"kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976132 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976176 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58grw\" (UniqueName: 
\"kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.976352 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.982114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.982807 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.982890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.983995 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.988638 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.988810 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle\") pod 
\"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.991109 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:01 crc kubenswrapper[4909]: I1002 18:41:01.993850 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j95qz\" (UniqueName: \"kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz\") pod \"heat-engine-7b47684999-l2rd2\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.013490 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58grw\" (UniqueName: \"kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw\") pod \"heat-api-8578895c89-hpjqt\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.015631 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.055532 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.446290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" event={"ID":"a4ec2d23-77a0-413f-a539-39098058c284","Type":"ContainerStarted","Data":"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f"} Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.446781 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.448594 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerStarted","Data":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.472073 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" podStartSLOduration=7.472015623 podStartE2EDuration="7.472015623s" podCreationTimestamp="2025-10-02 18:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:02.465794767 +0000 UTC m=+1383.653290626" watchObservedRunningTime="2025-10-02 18:41:02.472015623 +0000 UTC m=+1383.659511472" Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.552653 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"] Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.566499 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"] Oct 02 18:41:02 crc kubenswrapper[4909]: I1002 18:41:02.719190 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"] Oct 02 18:41:02 crc kubenswrapper[4909]: W1002 18:41:02.856264 4909 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode918212c_fc0d_49e8_b1d0_c4fb7e1ce7bf.slice/crio-86e684c585bc150e66534991573a56b7de239aeccec7a3f86fffa6ceb2cd5aa6 WatchSource:0}: Error finding container 86e684c585bc150e66534991573a56b7de239aeccec7a3f86fffa6ceb2cd5aa6: Status 404 returned error can't find the container with id 86e684c585bc150e66534991573a56b7de239aeccec7a3f86fffa6ceb2cd5aa6 Oct 02 18:41:02 crc kubenswrapper[4909]: W1002 18:41:02.864807 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc60eba96_3123_47ee_a374_50889549cc50.slice/crio-62119b083aa0aaee85af6a4d407e3d83076abd6d87d1f56fc45da0c08cb77429 WatchSource:0}: Error finding container 62119b083aa0aaee85af6a4d407e3d83076abd6d87d1f56fc45da0c08cb77429: Status 404 returned error can't find the container with id 62119b083aa0aaee85af6a4d407e3d83076abd6d87d1f56fc45da0c08cb77429 Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.213855 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9f896796b-dt79t"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.258158 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.313508 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.330157 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.336753 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.342941 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.344801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.367812 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.369509 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.371940 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.372390 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.380257 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.472992 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" event={"ID":"df23b609-c698-4f30-a349-df1fef294948","Type":"ContainerStarted","Data":"9d19c47d3e080e386c200eb5b1f4567b117324df3aeed824cda0b001c77db7da"} Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.476075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b47684999-l2rd2" 
event={"ID":"c60eba96-3123-47ee-a374-50889549cc50","Type":"ContainerStarted","Data":"62119b083aa0aaee85af6a4d407e3d83076abd6d87d1f56fc45da0c08cb77429"} Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.477733 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8578895c89-hpjqt" event={"ID":"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf","Type":"ContainerStarted","Data":"86e684c585bc150e66534991573a56b7de239aeccec7a3f86fffa6ceb2cd5aa6"} Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517429 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrz4\" (UniqueName: \"kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517498 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs\") pod 
\"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517570 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517643 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517658 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45m2\" (UniqueName: \"kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.517680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45m2\" (UniqueName: 
\"kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619444 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619547 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619601 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrz4\" (UniqueName: \"kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619731 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.619816 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.626059 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.627271 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.627288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.630566 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.631341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs\") pod 
\"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.633651 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.633874 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.634193 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.634325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.634726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " 
pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.646050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrz4\" (UniqueName: \"kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4\") pod \"heat-cfnapi-5446d8bcf6-rnrrj\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.646409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45m2\" (UniqueName: \"kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2\") pod \"heat-api-5d74c74f69-9pjww\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.655559 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:03 crc kubenswrapper[4909]: I1002 18:41:03.687740 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.292183 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.490081 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" event={"ID":"df23b609-c698-4f30-a349-df1fef294948","Type":"ContainerStarted","Data":"8f4e7f2d55d8dcc812b7ffded0470495f6e782bf078cabd74feedbc038cdef01"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.491410 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.491718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b47684999-l2rd2" event={"ID":"c60eba96-3123-47ee-a374-50889549cc50","Type":"ContainerStarted","Data":"66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.491803 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.494535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerStarted","Data":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.504366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8578895c89-hpjqt" event={"ID":"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf","Type":"ContainerStarted","Data":"94fd0b5a757b97f388249efea3b2cd9c36efcda0d5cb740f595077d0023a6948"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.504629 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-8578895c89-hpjqt" 
Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.505909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5d74c74f69-9pjww" event={"ID":"d90f5cd7-6d13-41b6-8c6f-86121b523321","Type":"ContainerStarted","Data":"11783afa0682a494e1319772cadd242637802d6194ad694d2550f71e6ef16f0d"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.510206 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" podStartSLOduration=2.484847381 podStartE2EDuration="3.510195873s" podCreationTimestamp="2025-10-02 18:41:01 +0000 UTC" firstStartedPulling="2025-10-02 18:41:02.863768395 +0000 UTC m=+1384.051264254" lastFinishedPulling="2025-10-02 18:41:03.889116887 +0000 UTC m=+1385.076612746" observedRunningTime="2025-10-02 18:41:04.507470888 +0000 UTC m=+1385.694966757" watchObservedRunningTime="2025-10-02 18:41:04.510195873 +0000 UTC m=+1385.697691732" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.510986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" event={"ID":"20c64b6e-d451-4962-9793-532f6c31f79d","Type":"ContainerStarted","Data":"a9c7e2dcc9cfc4a50071740d1912151f9e585eb62535a42a7a72094b1eadd883"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.511180 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" containerName="heat-cfnapi" containerID="cri-o://a9c7e2dcc9cfc4a50071740d1912151f9e585eb62535a42a7a72094b1eadd883" gracePeriod=60 Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.511472 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.516070 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f896796b-dt79t" 
event={"ID":"25396a7c-449c-4891-9f05-80acf9ef5309","Type":"ContainerStarted","Data":"21d99a2ba58de400adfed2c828e3916fd93dc568857260758f92538f615cb8c5"} Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.516236 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-9f896796b-dt79t" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" containerName="heat-api" containerID="cri-o://21d99a2ba58de400adfed2c828e3916fd93dc568857260758f92538f615cb8c5" gracePeriod=60 Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.516330 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9f896796b-dt79t" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.533596 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7b47684999-l2rd2" podStartSLOduration=3.533574181 podStartE2EDuration="3.533574181s" podCreationTimestamp="2025-10-02 18:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:04.525228387 +0000 UTC m=+1385.712724246" watchObservedRunningTime="2025-10-02 18:41:04.533574181 +0000 UTC m=+1385.721070040" Oct 02 18:41:04 crc kubenswrapper[4909]: W1002 18:41:04.545048 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode56e6356_0b90_4570_8068_d341cf2c7b50.slice/crio-4de51374614b9836c11ce7f131f4827e9cc9b5b703738f4dab4db493f35eb6c9 WatchSource:0}: Error finding container 4de51374614b9836c11ce7f131f4827e9cc9b5b703738f4dab4db493f35eb6c9: Status 404 returned error can't find the container with id 4de51374614b9836c11ce7f131f4827e9cc9b5b703738f4dab4db493f35eb6c9 Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.550129 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 
18:41:04.556654 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-8578895c89-hpjqt" podStartSLOduration=2.526919066 podStartE2EDuration="3.556631096s" podCreationTimestamp="2025-10-02 18:41:01 +0000 UTC" firstStartedPulling="2025-10-02 18:41:02.864952972 +0000 UTC m=+1384.052448831" lastFinishedPulling="2025-10-02 18:41:03.894665002 +0000 UTC m=+1385.082160861" observedRunningTime="2025-10-02 18:41:04.542793531 +0000 UTC m=+1385.730289400" watchObservedRunningTime="2025-10-02 18:41:04.556631096 +0000 UTC m=+1385.744126965" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.571861 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" podStartSLOduration=6.111991657 podStartE2EDuration="9.571834586s" podCreationTimestamp="2025-10-02 18:40:55 +0000 UTC" firstStartedPulling="2025-10-02 18:41:00.352083036 +0000 UTC m=+1381.539578885" lastFinishedPulling="2025-10-02 18:41:03.811925955 +0000 UTC m=+1384.999421814" observedRunningTime="2025-10-02 18:41:04.558712132 +0000 UTC m=+1385.746207991" watchObservedRunningTime="2025-10-02 18:41:04.571834586 +0000 UTC m=+1385.759330465" Oct 02 18:41:04 crc kubenswrapper[4909]: I1002 18:41:04.589632 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9f896796b-dt79t" podStartSLOduration=6.271854953 podStartE2EDuration="9.589608625s" podCreationTimestamp="2025-10-02 18:40:55 +0000 UTC" firstStartedPulling="2025-10-02 18:41:00.523631232 +0000 UTC m=+1381.711127091" lastFinishedPulling="2025-10-02 18:41:03.841384904 +0000 UTC m=+1385.028880763" observedRunningTime="2025-10-02 18:41:04.572630791 +0000 UTC m=+1385.760126660" watchObservedRunningTime="2025-10-02 18:41:04.589608625 +0000 UTC m=+1385.777104484" Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.537279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" 
event={"ID":"e56e6356-0b90-4570-8068-d341cf2c7b50","Type":"ContainerStarted","Data":"53c16e2793e47e764f80056293ee032fca975d5f5405993af1861bfd23a216f0"} Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.537582 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" event={"ID":"e56e6356-0b90-4570-8068-d341cf2c7b50","Type":"ContainerStarted","Data":"4de51374614b9836c11ce7f131f4827e9cc9b5b703738f4dab4db493f35eb6c9"} Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.537759 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.542916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5d74c74f69-9pjww" event={"ID":"d90f5cd7-6d13-41b6-8c6f-86121b523321","Type":"ContainerStarted","Data":"b01d645485bba5753eb015244c7bb356a5cfcecb4ac8d8193374c4b5fdd50109"} Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.557049 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" podStartSLOduration=2.557013023 podStartE2EDuration="2.557013023s" podCreationTimestamp="2025-10-02 18:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:05.554755912 +0000 UTC m=+1386.742251781" watchObservedRunningTime="2025-10-02 18:41:05.557013023 +0000 UTC m=+1386.744508882" Oct 02 18:41:05 crc kubenswrapper[4909]: I1002 18:41:05.622953 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5d74c74f69-9pjww" podStartSLOduration=2.622925409 podStartE2EDuration="2.622925409s" podCreationTimestamp="2025-10-02 18:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:05.576232569 +0000 UTC 
m=+1386.763728428" watchObservedRunningTime="2025-10-02 18:41:05.622925409 +0000 UTC m=+1386.810421278" Oct 02 18:41:06 crc kubenswrapper[4909]: I1002 18:41:06.561276 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.571054 4909 generic.go:334] "Generic (PLEG): container finished" podID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerID="94fd0b5a757b97f388249efea3b2cd9c36efcda0d5cb740f595077d0023a6948" exitCode=1 Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.571273 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8578895c89-hpjqt" event={"ID":"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf","Type":"ContainerDied","Data":"94fd0b5a757b97f388249efea3b2cd9c36efcda0d5cb740f595077d0023a6948"} Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.572265 4909 scope.go:117] "RemoveContainer" containerID="94fd0b5a757b97f388249efea3b2cd9c36efcda0d5cb740f595077d0023a6948" Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.575910 4909 generic.go:334] "Generic (PLEG): container finished" podID="df23b609-c698-4f30-a349-df1fef294948" containerID="8f4e7f2d55d8dcc812b7ffded0470495f6e782bf078cabd74feedbc038cdef01" exitCode=1 Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.575985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" event={"ID":"df23b609-c698-4f30-a349-df1fef294948","Type":"ContainerDied","Data":"8f4e7f2d55d8dcc812b7ffded0470495f6e782bf078cabd74feedbc038cdef01"} Oct 02 18:41:07 crc kubenswrapper[4909]: I1002 18:41:07.576738 4909 scope.go:117] "RemoveContainer" containerID="8f4e7f2d55d8dcc812b7ffded0470495f6e782bf078cabd74feedbc038cdef01" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.587708 4909 generic.go:334] "Generic (PLEG): container finished" podID="df23b609-c698-4f30-a349-df1fef294948" 
containerID="6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7" exitCode=1 Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.587776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" event={"ID":"df23b609-c698-4f30-a349-df1fef294948","Type":"ContainerDied","Data":"6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7"} Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.588057 4909 scope.go:117] "RemoveContainer" containerID="8f4e7f2d55d8dcc812b7ffded0470495f6e782bf078cabd74feedbc038cdef01" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.588759 4909 scope.go:117] "RemoveContainer" containerID="6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7" Oct 02 18:41:08 crc kubenswrapper[4909]: E1002 18:41:08.588973 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6cc9877f78-zbcsw_openstack(df23b609-c698-4f30-a349-df1fef294948)\"" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" podUID="df23b609-c698-4f30-a349-df1fef294948" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.596454 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerStarted","Data":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.598312 4909 generic.go:334] "Generic (PLEG): container finished" podID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerID="b453517070d4009aa1a49bc59c905d17cc110819691b45b409bfa5790dafd40c" exitCode=1 Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.598344 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8578895c89-hpjqt" 
event={"ID":"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf","Type":"ContainerDied","Data":"b453517070d4009aa1a49bc59c905d17cc110819691b45b409bfa5790dafd40c"} Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.599000 4909 scope.go:117] "RemoveContainer" containerID="b453517070d4009aa1a49bc59c905d17cc110819691b45b409bfa5790dafd40c" Oct 02 18:41:08 crc kubenswrapper[4909]: E1002 18:41:08.599219 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-8578895c89-hpjqt_openstack(e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf)\"" pod="openstack/heat-api-8578895c89-hpjqt" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.655103 4909 scope.go:117] "RemoveContainer" containerID="94fd0b5a757b97f388249efea3b2cd9c36efcda0d5cb740f595077d0023a6948" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.710931 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8c5jp"] Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.712318 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.801078 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8c5jp"] Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.856603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z68f\" (UniqueName: \"kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f\") pod \"nova-api-db-create-8c5jp\" (UID: \"930a4d62-49c6-4a03-87d1-a91e08c5f01f\") " pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.962354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z68f\" (UniqueName: \"kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f\") pod \"nova-api-db-create-8c5jp\" (UID: \"930a4d62-49c6-4a03-87d1-a91e08c5f01f\") " pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.968090 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zcxfk"] Oct 02 18:41:08 crc kubenswrapper[4909]: I1002 18:41:08.969640 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.023133 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zcxfk"] Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.032734 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z68f\" (UniqueName: \"kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f\") pod \"nova-api-db-create-8c5jp\" (UID: \"930a4d62-49c6-4a03-87d1-a91e08c5f01f\") " pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.062583 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-r29fp"] Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.064766 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.074326 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglqq\" (UniqueName: \"kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq\") pod \"nova-cell0-db-create-zcxfk\" (UID: \"a41e0c4c-75a5-47e0-8e07-4b851ff9feda\") " pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.087229 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.113343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-r29fp"] Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.177572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqpdc\" (UniqueName: \"kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc\") pod \"nova-cell1-db-create-r29fp\" (UID: \"c4a1121e-8850-4ae8-9f23-058732d8cf96\") " pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.177739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglqq\" (UniqueName: \"kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq\") pod \"nova-cell0-db-create-zcxfk\" (UID: \"a41e0c4c-75a5-47e0-8e07-4b851ff9feda\") " pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.195450 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fglqq\" (UniqueName: \"kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq\") pod \"nova-cell0-db-create-zcxfk\" (UID: \"a41e0c4c-75a5-47e0-8e07-4b851ff9feda\") " pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.280942 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqpdc\" (UniqueName: \"kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc\") pod \"nova-cell1-db-create-r29fp\" (UID: \"c4a1121e-8850-4ae8-9f23-058732d8cf96\") " pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.312661 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqpdc\" 
(UniqueName: \"kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc\") pod \"nova-cell1-db-create-r29fp\" (UID: \"c4a1121e-8850-4ae8-9f23-058732d8cf96\") " pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.423651 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.448867 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.661663 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerStarted","Data":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.661875 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-central-agent" containerID="cri-o://f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" gracePeriod=30 Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.662174 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.662449 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="proxy-httpd" containerID="cri-o://7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" gracePeriod=30 Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.662492 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="sg-core" 
containerID="cri-o://c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" gracePeriod=30 Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.662546 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-notification-agent" containerID="cri-o://bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" gracePeriod=30 Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.666462 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8c5jp"] Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.690127 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.435112226 podStartE2EDuration="17.690110651s" podCreationTimestamp="2025-10-02 18:40:52 +0000 UTC" firstStartedPulling="2025-10-02 18:41:00.685765659 +0000 UTC m=+1381.873261518" lastFinishedPulling="2025-10-02 18:41:08.940764084 +0000 UTC m=+1390.128259943" observedRunningTime="2025-10-02 18:41:09.687583612 +0000 UTC m=+1390.875079471" watchObservedRunningTime="2025-10-02 18:41:09.690110651 +0000 UTC m=+1390.877606510" Oct 02 18:41:09 crc kubenswrapper[4909]: I1002 18:41:09.986050 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zcxfk"] Oct 02 18:41:10 crc kubenswrapper[4909]: W1002 18:41:10.005571 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda41e0c4c_75a5_47e0_8e07_4b851ff9feda.slice/crio-86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718 WatchSource:0}: Error finding container 86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718: Status 404 returned error can't find the container with id 86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.117890 
4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-r29fp"] Oct 02 18:41:10 crc kubenswrapper[4909]: W1002 18:41:10.201525 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4a1121e_8850_4ae8_9f23_058732d8cf96.slice/crio-bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb WatchSource:0}: Error finding container bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb: Status 404 returned error can't find the container with id bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.512143 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.517166 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531677 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531730 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd\") pod 
\"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531771 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.531854 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5qz5\" (UniqueName: \"kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5\") pod \"dfeecec6-7723-4b63-9152-c9183e3f877d\" (UID: \"dfeecec6-7723-4b63-9152-c9183e3f877d\") " Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.537800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.538495 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.565569 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts" (OuterVolumeSpecName: "scripts") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.579473 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5" (OuterVolumeSpecName: "kube-api-access-g5qz5") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "kube-api-access-g5qz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.633169 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.633429 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="dnsmasq-dns" containerID="cri-o://6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03" gracePeriod=10 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.641876 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.642772 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.642807 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.642816 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfeecec6-7723-4b63-9152-c9183e3f877d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.642826 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5qz5\" (UniqueName: 
\"kubernetes.io/projected/dfeecec6-7723-4b63-9152-c9183e3f877d-kube-api-access-g5qz5\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.718071 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.719179 4909 generic.go:334] "Generic (PLEG): container finished" podID="c4a1121e-8850-4ae8-9f23-058732d8cf96" containerID="53aeb06a6630555f39a15cbb85b64809fcc92f946ae25b0709a9a8c2a9776ce3" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.719278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r29fp" event={"ID":"c4a1121e-8850-4ae8-9f23-058732d8cf96","Type":"ContainerDied","Data":"53aeb06a6630555f39a15cbb85b64809fcc92f946ae25b0709a9a8c2a9776ce3"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.719314 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r29fp" event={"ID":"c4a1121e-8850-4ae8-9f23-058732d8cf96","Type":"ContainerStarted","Data":"bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.748302 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.748344 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 
18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.749537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data" (OuterVolumeSpecName: "config-data") pod "dfeecec6-7723-4b63-9152-c9183e3f877d" (UID: "dfeecec6-7723-4b63-9152-c9183e3f877d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751394 4909 generic.go:334] "Generic (PLEG): container finished" podID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751429 4909 generic.go:334] "Generic (PLEG): container finished" podID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" exitCode=2 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751437 4909 generic.go:334] "Generic (PLEG): container finished" podID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751445 4909 generic.go:334] "Generic (PLEG): container finished" podID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751498 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerDied","Data":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751543 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerDied","Data":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerDied","Data":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751566 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerDied","Data":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfeecec6-7723-4b63-9152-c9183e3f877d","Type":"ContainerDied","Data":"0324154a87c85a21308c54781fa3d8d2a713dfac6b59730ff922e7fb867d4177"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751590 4909 scope.go:117] "RemoveContainer" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.751706 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.772680 4909 generic.go:334] "Generic (PLEG): container finished" podID="930a4d62-49c6-4a03-87d1-a91e08c5f01f" containerID="c3bcdcd1d51962bce660a6cbcabd71ff46ae088232d092070aafb299e543cba2" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.772756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8c5jp" event={"ID":"930a4d62-49c6-4a03-87d1-a91e08c5f01f","Type":"ContainerDied","Data":"c3bcdcd1d51962bce660a6cbcabd71ff46ae088232d092070aafb299e543cba2"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.772788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8c5jp" event={"ID":"930a4d62-49c6-4a03-87d1-a91e08c5f01f","Type":"ContainerStarted","Data":"84bcdf27943b19cce2867504d511fb462156a70a5ac2db496fc9d2838ec106a3"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.791735 4909 generic.go:334] "Generic (PLEG): container finished" podID="a41e0c4c-75a5-47e0-8e07-4b851ff9feda" containerID="d9afede484450c6c68a142bc799051644f2f72f742b4d07ed72f4d21f66d438b" exitCode=0 Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.791785 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zcxfk" event={"ID":"a41e0c4c-75a5-47e0-8e07-4b851ff9feda","Type":"ContainerDied","Data":"d9afede484450c6c68a142bc799051644f2f72f742b4d07ed72f4d21f66d438b"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.791810 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zcxfk" event={"ID":"a41e0c4c-75a5-47e0-8e07-4b851ff9feda","Type":"ContainerStarted","Data":"86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718"} Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.849580 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfeecec6-7723-4b63-9152-c9183e3f877d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.956998 4909 scope.go:117] "RemoveContainer" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:10 crc kubenswrapper[4909]: I1002 18:41:10.993915 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.016738 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.031930 4909 scope.go:117] "RemoveContainer" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.037660 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.038146 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-central-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038165 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-central-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.038185 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="sg-core" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038191 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="sg-core" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.038210 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-notification-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038216 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-notification-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.038230 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="proxy-httpd" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038236 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="proxy-httpd" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038420 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-central-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038448 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="ceilometer-notification-agent" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038463 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="proxy-httpd" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.038476 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" containerName="sg-core" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.040403 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.042559 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.042733 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.046957 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.060741 4909 scope.go:117] "RemoveContainer" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.083812 4909 scope.go:117] "RemoveContainer" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.084194 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": container with ID starting with 7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56 not found: ID does not exist" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.084222 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} err="failed to get container status \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": rpc error: code = NotFound desc = could not find container \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": container with ID starting with 7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56 not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 
18:41:11.084242 4909 scope.go:117] "RemoveContainer" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.085380 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": container with ID starting with c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a not found: ID does not exist" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085403 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} err="failed to get container status \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": rpc error: code = NotFound desc = could not find container \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": container with ID starting with c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085418 4909 scope.go:117] "RemoveContainer" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.085626 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": container with ID starting with bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d not found: ID does not exist" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085642 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} err="failed to get container status \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": rpc error: code = NotFound desc = could not find container \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": container with ID starting with bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085654 4909 scope.go:117] "RemoveContainer" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.085864 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": container with ID starting with f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf not found: ID does not exist" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085884 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} err="failed to get container status \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": rpc error: code = NotFound desc = could not find container \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": container with ID starting with f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.085897 4909 scope.go:117] "RemoveContainer" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086113 4909 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} err="failed to get container status \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": rpc error: code = NotFound desc = could not find container \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": container with ID starting with 7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56 not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086127 4909 scope.go:117] "RemoveContainer" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086314 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} err="failed to get container status \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": rpc error: code = NotFound desc = could not find container \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": container with ID starting with c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086328 4909 scope.go:117] "RemoveContainer" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086491 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} err="failed to get container status \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": rpc error: code = NotFound desc = could not find container \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": container with ID starting with bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d not 
found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086505 4909 scope.go:117] "RemoveContainer" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086667 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} err="failed to get container status \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": rpc error: code = NotFound desc = could not find container \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": container with ID starting with f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086679 4909 scope.go:117] "RemoveContainer" containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086857 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} err="failed to get container status \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": rpc error: code = NotFound desc = could not find container \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": container with ID starting with 7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56 not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.086871 4909 scope.go:117] "RemoveContainer" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.087124 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} err="failed to get 
container status \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": rpc error: code = NotFound desc = could not find container \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": container with ID starting with c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.087142 4909 scope.go:117] "RemoveContainer" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.087329 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} err="failed to get container status \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": rpc error: code = NotFound desc = could not find container \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": container with ID starting with bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.087345 4909 scope.go:117] "RemoveContainer" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088397 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} err="failed to get container status \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": rpc error: code = NotFound desc = could not find container \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": container with ID starting with f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088418 4909 scope.go:117] "RemoveContainer" 
containerID="7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088610 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56"} err="failed to get container status \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": rpc error: code = NotFound desc = could not find container \"7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56\": container with ID starting with 7b97264e5e753bc8496225a4e0c56a15d39074d6820fd85352de849af731de56 not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088628 4909 scope.go:117] "RemoveContainer" containerID="c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088822 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a"} err="failed to get container status \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": rpc error: code = NotFound desc = could not find container \"c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a\": container with ID starting with c055e3658de24825913303db041b58515d5d6bf864cd29e8440e65251739068a not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.088841 4909 scope.go:117] "RemoveContainer" containerID="bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.089111 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d"} err="failed to get container status \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": rpc error: code = NotFound desc = could 
not find container \"bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d\": container with ID starting with bc0999080796de12bdb52801258a297d222a0cf6feb05d115bfbb2cee8158a5d not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.089129 4909 scope.go:117] "RemoveContainer" containerID="f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.089318 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf"} err="failed to get container status \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": rpc error: code = NotFound desc = could not find container \"f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf\": container with ID starting with f05d7667cff4f60fa9bc6b9b09e4de95bd451d1e4452803813054e0759646baf not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166145 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd\") pod \"ceilometer-0\" 
(UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166278 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166337 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.166387 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5ws\" (UniqueName: \"kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: 
I1002 18:41:11.269370 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269450 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5ws\" (UniqueName: \"kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269504 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.269613 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd\") pod \"ceilometer-0\" 
(UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.271746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.275471 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.275489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.277181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.277390 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.285419 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5ws\" (UniqueName: 
\"kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.285790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts\") pod \"ceilometer-0\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.374674 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.384279 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.578915 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.579175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.579249 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.579905 
4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.579930 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d8z8\" (UniqueName: \"kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.579947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb\") pod \"4941e170-ef64-42b6-b9a1-56deb33e252d\" (UID: \"4941e170-ef64-42b6-b9a1-56deb33e252d\") " Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.608417 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8" (OuterVolumeSpecName: "kube-api-access-9d8z8") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "kube-api-access-9d8z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.640695 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfeecec6-7723-4b63-9152-c9183e3f877d" path="/var/lib/kubelet/pods/dfeecec6-7723-4b63-9152-c9183e3f877d/volumes" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.681969 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d8z8\" (UniqueName: \"kubernetes.io/projected/4941e170-ef64-42b6-b9a1-56deb33e252d-kube-api-access-9d8z8\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.705286 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.707869 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.723649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config" (OuterVolumeSpecName: "config") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.738428 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.759973 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4941e170-ef64-42b6-b9a1-56deb33e252d" (UID: "4941e170-ef64-42b6-b9a1-56deb33e252d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.785463 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.785493 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.785504 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.785514 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc 
kubenswrapper[4909]: I1002 18:41:11.785522 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4941e170-ef64-42b6-b9a1-56deb33e252d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.832018 4909 generic.go:334] "Generic (PLEG): container finished" podID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerID="6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03" exitCode=0 Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.832268 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.857104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" event={"ID":"4941e170-ef64-42b6-b9a1-56deb33e252d","Type":"ContainerDied","Data":"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03"} Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.857193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" event={"ID":"4941e170-ef64-42b6-b9a1-56deb33e252d","Type":"ContainerDied","Data":"3959718f01c1c86915613f546a8782643ee41fe405506816c775bd18c64c3038"} Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.857225 4909 scope.go:117] "RemoveContainer" containerID="6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.884306 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.900998 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-pqdnn"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.907991 4909 scope.go:117] "RemoveContainer" 
containerID="7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.912979 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.988005 4909 scope.go:117] "RemoveContainer" containerID="6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.989530 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03\": container with ID starting with 6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03 not found: ID does not exist" containerID="6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.989569 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03"} err="failed to get container status \"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03\": rpc error: code = NotFound desc = could not find container \"6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03\": container with ID starting with 6bc7238d84c4676e045433635cecd64ba9c1eb132d9f56ee356c15029b047b03 not found: ID does not exist" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.989600 4909 scope.go:117] "RemoveContainer" containerID="7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.991838 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.992152 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.992497 4909 scope.go:117] "RemoveContainer" containerID="6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.992889 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6cc9877f78-zbcsw_openstack(df23b609-c698-4f30-a349-df1fef294948)\"" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" podUID="df23b609-c698-4f30-a349-df1fef294948" Oct 02 18:41:11 crc kubenswrapper[4909]: E1002 18:41:11.992923 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924\": container with ID starting with 7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924 not found: ID does not exist" containerID="7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924" Oct 02 18:41:11 crc kubenswrapper[4909]: I1002 18:41:11.992950 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924"} err="failed to get container status \"7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924\": rpc error: code = NotFound desc = could not find container \"7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924\": container with ID starting with 7e1e1596d131cc695814cbb379c0d1e9f0c20efa98a7c18afbae388b3ce99924 not found: ID does not exist" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.057240 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.057308 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.058187 4909 scope.go:117] "RemoveContainer" containerID="b453517070d4009aa1a49bc59c905d17cc110819691b45b409bfa5790dafd40c" Oct 02 18:41:12 crc kubenswrapper[4909]: E1002 18:41:12.058463 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-8578895c89-hpjqt_openstack(e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf)\"" pod="openstack/heat-api-8578895c89-hpjqt" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.329305 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.398856 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqpdc\" (UniqueName: \"kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc\") pod \"c4a1121e-8850-4ae8-9f23-058732d8cf96\" (UID: \"c4a1121e-8850-4ae8-9f23-058732d8cf96\") " Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.415141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc" (OuterVolumeSpecName: "kube-api-access-qqpdc") pod "c4a1121e-8850-4ae8-9f23-058732d8cf96" (UID: "c4a1121e-8850-4ae8-9f23-058732d8cf96"). InnerVolumeSpecName "kube-api-access-qqpdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.415781 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-9f896796b-dt79t" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.473811 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.504575 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqpdc\" (UniqueName: \"kubernetes.io/projected/c4a1121e-8850-4ae8-9f23-058732d8cf96-kube-api-access-qqpdc\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.520244 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.536947 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.708417 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fglqq\" (UniqueName: \"kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq\") pod \"a41e0c4c-75a5-47e0-8e07-4b851ff9feda\" (UID: \"a41e0c4c-75a5-47e0-8e07-4b851ff9feda\") " Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.709314 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z68f\" (UniqueName: \"kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f\") pod \"930a4d62-49c6-4a03-87d1-a91e08c5f01f\" (UID: \"930a4d62-49c6-4a03-87d1-a91e08c5f01f\") " Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.712892 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq" (OuterVolumeSpecName: "kube-api-access-fglqq") pod "a41e0c4c-75a5-47e0-8e07-4b851ff9feda" (UID: "a41e0c4c-75a5-47e0-8e07-4b851ff9feda"). InnerVolumeSpecName "kube-api-access-fglqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.713882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f" (OuterVolumeSpecName: "kube-api-access-2z68f") pod "930a4d62-49c6-4a03-87d1-a91e08c5f01f" (UID: "930a4d62-49c6-4a03-87d1-a91e08c5f01f"). InnerVolumeSpecName "kube-api-access-2z68f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.812433 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fglqq\" (UniqueName: \"kubernetes.io/projected/a41e0c4c-75a5-47e0-8e07-4b851ff9feda-kube-api-access-fglqq\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.812578 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z68f\" (UniqueName: \"kubernetes.io/projected/930a4d62-49c6-4a03-87d1-a91e08c5f01f-kube-api-access-2z68f\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.846839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerStarted","Data":"5210280377efb97da148312ac44bc502f7873524eb8fb362773c0fe62826bc7c"} Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.847095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerStarted","Data":"972cf3a116bbb95a0161a6d21825fdc0dc517eacb076d079509975db84cdad9f"} Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.848677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-r29fp" event={"ID":"c4a1121e-8850-4ae8-9f23-058732d8cf96","Type":"ContainerDied","Data":"bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb"} Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.848825 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf4903ba9438504766588e673b55308a86f32a687918d4ec53073d78c6565eb" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.848689 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-r29fp" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.851892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8c5jp" event={"ID":"930a4d62-49c6-4a03-87d1-a91e08c5f01f","Type":"ContainerDied","Data":"84bcdf27943b19cce2867504d511fb462156a70a5ac2db496fc9d2838ec106a3"} Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.851998 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84bcdf27943b19cce2867504d511fb462156a70a5ac2db496fc9d2838ec106a3" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.852144 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8c5jp" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.860889 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zcxfk" event={"ID":"a41e0c4c-75a5-47e0-8e07-4b851ff9feda","Type":"ContainerDied","Data":"86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718"} Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.860934 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d210b0cb20741fc05485eab31fa985c44ee518f88b18440a585c1cce1ba718" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.860994 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zcxfk" Oct 02 18:41:12 crc kubenswrapper[4909]: I1002 18:41:12.865241 4909 scope.go:117] "RemoveContainer" containerID="6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7" Oct 02 18:41:12 crc kubenswrapper[4909]: E1002 18:41:12.865601 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6cc9877f78-zbcsw_openstack(df23b609-c698-4f30-a349-df1fef294948)\"" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" podUID="df23b609-c698-4f30-a349-df1fef294948" Oct 02 18:41:13 crc kubenswrapper[4909]: I1002 18:41:13.622598 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" path="/var/lib/kubelet/pods/4941e170-ef64-42b6-b9a1-56deb33e252d/volumes" Oct 02 18:41:14 crc kubenswrapper[4909]: I1002 18:41:14.863313 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:41:14 crc kubenswrapper[4909]: I1002 18:41:14.886835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerStarted","Data":"13a546b3014a29fa89177c8c83fafc52dfcbe971f9902bc606b89a3e4df35453"} Oct 02 18:41:14 crc kubenswrapper[4909]: I1002 18:41:14.886882 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerStarted","Data":"641861bb98cbb7ca585238374bb76b226a124d41acdb50477e8cde66122f71a7"} Oct 02 18:41:14 crc kubenswrapper[4909]: I1002 18:41:14.930164 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"] Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.023199 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.085921 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"] Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.520996 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-78c47d4bdf-9rk8w" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.551400 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.559968 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.573451 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data\") pod \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580554 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle\") pod \"df23b609-c698-4f30-a349-df1fef294948\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58grw\" (UniqueName: \"kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw\") pod \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580722 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data\") pod \"df23b609-c698-4f30-a349-df1fef294948\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom\") pod \"df23b609-c698-4f30-a349-df1fef294948\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580862 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p97v\" (UniqueName: \"kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v\") pod \"df23b609-c698-4f30-a349-df1fef294948\" (UID: \"df23b609-c698-4f30-a349-df1fef294948\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom\") pod \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.580982 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle\") pod \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\" (UID: \"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf\") " Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.600565 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw" (OuterVolumeSpecName: "kube-api-access-58grw") pod "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" (UID: 
"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf"). InnerVolumeSpecName "kube-api-access-58grw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.614776 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" (UID: "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.616871 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df23b609-c698-4f30-a349-df1fef294948" (UID: "df23b609-c698-4f30-a349-df1fef294948"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.621745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v" (OuterVolumeSpecName: "kube-api-access-7p97v") pod "df23b609-c698-4f30-a349-df1fef294948" (UID: "df23b609-c698-4f30-a349-df1fef294948"). InnerVolumeSpecName "kube-api-access-7p97v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.642349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df23b609-c698-4f30-a349-df1fef294948" (UID: "df23b609-c698-4f30-a349-df1fef294948"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.665609 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" (UID: "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.667262 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data" (OuterVolumeSpecName: "config-data") pod "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" (UID: "e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683664 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683694 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p97v\" (UniqueName: \"kubernetes.io/projected/df23b609-c698-4f30-a349-df1fef294948-kube-api-access-7p97v\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683704 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683713 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683721 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683729 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.683737 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58grw\" (UniqueName: \"kubernetes.io/projected/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf-kube-api-access-58grw\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.708897 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data" (OuterVolumeSpecName: "config-data") pod "df23b609-c698-4f30-a349-df1fef294948" (UID: "df23b609-c698-4f30-a349-df1fef294948"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.785395 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23b609-c698-4f30-a349-df1fef294948-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.928866 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8578895c89-hpjqt" event={"ID":"e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf","Type":"ContainerDied","Data":"86e684c585bc150e66534991573a56b7de239aeccec7a3f86fffa6ceb2cd5aa6"} Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.928919 4909 scope.go:117] "RemoveContainer" containerID="b453517070d4009aa1a49bc59c905d17cc110819691b45b409bfa5790dafd40c" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.928973 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8578895c89-hpjqt" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.935451 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" event={"ID":"df23b609-c698-4f30-a349-df1fef294948","Type":"ContainerDied","Data":"9d19c47d3e080e386c200eb5b1f4567b117324df3aeed824cda0b001c77db7da"} Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.935529 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6cc9877f78-zbcsw" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.963309 4909 scope.go:117] "RemoveContainer" containerID="6ac2de9c4a28a05339d86a095b8954b39f8e09f488ed8d3d86a4c658247135e7" Oct 02 18:41:15 crc kubenswrapper[4909]: I1002 18:41:15.963879 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"] Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.019306 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6cc9877f78-zbcsw"] Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.035101 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"] Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.036848 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-pqdnn" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: i/o timeout" Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.051277 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-8578895c89-hpjqt"] Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.948125 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerStarted","Data":"f04f4a7980126427a1fa9b15d7d70ab299c62296897dd188530fa2b2c0b8764a"} Oct 02 18:41:16 crc kubenswrapper[4909]: I1002 18:41:16.949590 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:41:17 crc kubenswrapper[4909]: I1002 18:41:17.620485 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df23b609-c698-4f30-a349-df1fef294948" path="/var/lib/kubelet/pods/df23b609-c698-4f30-a349-df1fef294948/volumes" Oct 02 18:41:17 crc kubenswrapper[4909]: I1002 18:41:17.621238 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" path="/var/lib/kubelet/pods/e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf/volumes" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.768344 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.953641645 podStartE2EDuration="8.768321752s" podCreationTimestamp="2025-10-02 18:41:10 +0000 UTC" firstStartedPulling="2025-10-02 18:41:11.916616386 +0000 UTC m=+1393.104112245" lastFinishedPulling="2025-10-02 18:41:15.731296493 +0000 UTC m=+1396.918792352" observedRunningTime="2025-10-02 18:41:16.972364762 +0000 UTC m=+1398.159860621" watchObservedRunningTime="2025-10-02 18:41:18.768321752 +0000 UTC m=+1399.955817611" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.768836 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-dcf5-account-create-bt9p8"] Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769339 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769358 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769374 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41e0c4c-75a5-47e0-8e07-4b851ff9feda" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769380 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41e0c4c-75a5-47e0-8e07-4b851ff9feda" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769391 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: 
I1002 18:41:18.769397 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769410 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769416 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769428 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a1121e-8850-4ae8-9f23-058732d8cf96" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769434 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a1121e-8850-4ae8-9f23-058732d8cf96" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="init" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769450 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="init" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769468 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930a4d62-49c6-4a03-87d1-a91e08c5f01f" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769473 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="930a4d62-49c6-4a03-87d1-a91e08c5f01f" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.769489 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="dnsmasq-dns" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 
18:41:18.769494 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="dnsmasq-dns" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769708 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="930a4d62-49c6-4a03-87d1-a91e08c5f01f" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769719 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769728 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769738 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769750 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4941e170-ef64-42b6-b9a1-56deb33e252d" containerName="dnsmasq-dns" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769761 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a1121e-8850-4ae8-9f23-058732d8cf96" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769773 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="df23b609-c698-4f30-a349-df1fef294948" containerName="heat-cfnapi" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.769789 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41e0c4c-75a5-47e0-8e07-4b851ff9feda" containerName="mariadb-database-create" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.770539 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.772207 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.781694 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dcf5-account-create-bt9p8"] Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.950054 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmxz\" (UniqueName: \"kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz\") pod \"nova-api-dcf5-account-create-bt9p8\" (UID: \"bd7ec535-cc21-449a-88f0-efdf67c0119f\") " pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.963641 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5e90-account-create-4mc5m"] Oct 02 18:41:18 crc kubenswrapper[4909]: E1002 18:41:18.964231 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.964254 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e918212c-fc0d-49e8-b1d0-c4fb7e1ce7bf" containerName="heat-api" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.965464 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.967703 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 18:41:18 crc kubenswrapper[4909]: I1002 18:41:18.980637 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5e90-account-create-4mc5m"] Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.051346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmxz\" (UniqueName: \"kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz\") pod \"nova-api-dcf5-account-create-bt9p8\" (UID: \"bd7ec535-cc21-449a-88f0-efdf67c0119f\") " pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.072650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmxz\" (UniqueName: \"kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz\") pod \"nova-api-dcf5-account-create-bt9p8\" (UID: \"bd7ec535-cc21-449a-88f0-efdf67c0119f\") " pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.100292 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.156474 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchd4\" (UniqueName: \"kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4\") pod \"nova-cell0-5e90-account-create-4mc5m\" (UID: \"3b6f61e1-d6fa-438d-9382-d9548af74f3f\") " pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.159971 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-752e-account-create-xqbw8"] Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.161332 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.166480 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.177933 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-752e-account-create-xqbw8"] Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.262191 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kchd4\" (UniqueName: \"kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4\") pod \"nova-cell0-5e90-account-create-4mc5m\" (UID: \"3b6f61e1-d6fa-438d-9382-d9548af74f3f\") " pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.262590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx2jr\" (UniqueName: \"kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr\") pod \"nova-cell1-752e-account-create-xqbw8\" (UID: 
\"37a27b3d-9b2f-4734-988b-e72baa85082d\") " pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.287710 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchd4\" (UniqueName: \"kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4\") pod \"nova-cell0-5e90-account-create-4mc5m\" (UID: \"3b6f61e1-d6fa-438d-9382-d9548af74f3f\") " pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.288190 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.366657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx2jr\" (UniqueName: \"kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr\") pod \"nova-cell1-752e-account-create-xqbw8\" (UID: \"37a27b3d-9b2f-4734-988b-e72baa85082d\") " pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.383509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx2jr\" (UniqueName: \"kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr\") pod \"nova-cell1-752e-account-create-xqbw8\" (UID: \"37a27b3d-9b2f-4734-988b-e72baa85082d\") " pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.536814 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.650226 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dcf5-account-create-bt9p8"] Oct 02 18:41:19 crc kubenswrapper[4909]: W1002 18:41:19.655765 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd7ec535_cc21_449a_88f0_efdf67c0119f.slice/crio-262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5 WatchSource:0}: Error finding container 262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5: Status 404 returned error can't find the container with id 262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5 Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.782510 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5e90-account-create-4mc5m"] Oct 02 18:41:19 crc kubenswrapper[4909]: W1002 18:41:19.793580 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b6f61e1_d6fa_438d_9382_d9548af74f3f.slice/crio-ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71 WatchSource:0}: Error finding container ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71: Status 404 returned error can't find the container with id ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71 Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.998303 4909 generic.go:334] "Generic (PLEG): container finished" podID="bd7ec535-cc21-449a-88f0-efdf67c0119f" containerID="c87168b1cf9ab041982b89023cd2d10690bb35dbb16e9436113a665069e2e1f5" exitCode=0 Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.998581 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dcf5-account-create-bt9p8" 
event={"ID":"bd7ec535-cc21-449a-88f0-efdf67c0119f","Type":"ContainerDied","Data":"c87168b1cf9ab041982b89023cd2d10690bb35dbb16e9436113a665069e2e1f5"} Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.998607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dcf5-account-create-bt9p8" event={"ID":"bd7ec535-cc21-449a-88f0-efdf67c0119f","Type":"ContainerStarted","Data":"262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5"} Oct 02 18:41:19 crc kubenswrapper[4909]: I1002 18:41:19.999967 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5e90-account-create-4mc5m" event={"ID":"3b6f61e1-d6fa-438d-9382-d9548af74f3f","Type":"ContainerStarted","Data":"ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71"} Oct 02 18:41:20 crc kubenswrapper[4909]: I1002 18:41:20.118663 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-752e-account-create-xqbw8"] Oct 02 18:41:20 crc kubenswrapper[4909]: W1002 18:41:20.120825 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37a27b3d_9b2f_4734_988b_e72baa85082d.slice/crio-e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36 WatchSource:0}: Error finding container e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36: Status 404 returned error can't find the container with id e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36 Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.013272 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b6f61e1-d6fa-438d-9382-d9548af74f3f" containerID="040c561f2c96fa18b46a5a71e966700017bf9ab5381acb6a5173564b27c33506" exitCode=0 Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.013384 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5e90-account-create-4mc5m" 
event={"ID":"3b6f61e1-d6fa-438d-9382-d9548af74f3f","Type":"ContainerDied","Data":"040c561f2c96fa18b46a5a71e966700017bf9ab5381acb6a5173564b27c33506"} Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.015832 4909 generic.go:334] "Generic (PLEG): container finished" podID="37a27b3d-9b2f-4734-988b-e72baa85082d" containerID="933b48d8afb9e3d36873eedf6cf8456da8b6c05dbda6c1d9c7f32c4cf3344cea" exitCode=0 Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.015879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-752e-account-create-xqbw8" event={"ID":"37a27b3d-9b2f-4734-988b-e72baa85082d","Type":"ContainerDied","Data":"933b48d8afb9e3d36873eedf6cf8456da8b6c05dbda6c1d9c7f32c4cf3344cea"} Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.015929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-752e-account-create-xqbw8" event={"ID":"37a27b3d-9b2f-4734-988b-e72baa85082d","Type":"ContainerStarted","Data":"e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36"} Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.539982 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.726411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wmxz\" (UniqueName: \"kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz\") pod \"bd7ec535-cc21-449a-88f0-efdf67c0119f\" (UID: \"bd7ec535-cc21-449a-88f0-efdf67c0119f\") " Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.736283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz" (OuterVolumeSpecName: "kube-api-access-5wmxz") pod "bd7ec535-cc21-449a-88f0-efdf67c0119f" (UID: "bd7ec535-cc21-449a-88f0-efdf67c0119f"). 
InnerVolumeSpecName "kube-api-access-5wmxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:21 crc kubenswrapper[4909]: I1002 18:41:21.828830 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wmxz\" (UniqueName: \"kubernetes.io/projected/bd7ec535-cc21-449a-88f0-efdf67c0119f-kube-api-access-5wmxz\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.033287 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dcf5-account-create-bt9p8" event={"ID":"bd7ec535-cc21-449a-88f0-efdf67c0119f","Type":"ContainerDied","Data":"262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5"} Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.033357 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262f571a9158e228ea8632997f44256f514d2ca52e5e20c6cbed274e2132e1a5" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.033386 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dcf5-account-create-bt9p8" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.074534 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.124898 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"] Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.125312 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-78c47d4bdf-9rk8w" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" containerID="cri-o://1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" gracePeriod=60 Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.669894 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.676269 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.747569 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx2jr\" (UniqueName: \"kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr\") pod \"37a27b3d-9b2f-4734-988b-e72baa85082d\" (UID: \"37a27b3d-9b2f-4734-988b-e72baa85082d\") " Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.747879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchd4\" (UniqueName: \"kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4\") pod \"3b6f61e1-d6fa-438d-9382-d9548af74f3f\" (UID: \"3b6f61e1-d6fa-438d-9382-d9548af74f3f\") " Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.763634 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4" (OuterVolumeSpecName: "kube-api-access-kchd4") pod "3b6f61e1-d6fa-438d-9382-d9548af74f3f" (UID: "3b6f61e1-d6fa-438d-9382-d9548af74f3f"). InnerVolumeSpecName "kube-api-access-kchd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.763946 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr" (OuterVolumeSpecName: "kube-api-access-rx2jr") pod "37a27b3d-9b2f-4734-988b-e72baa85082d" (UID: "37a27b3d-9b2f-4734-988b-e72baa85082d"). InnerVolumeSpecName "kube-api-access-rx2jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.849647 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx2jr\" (UniqueName: \"kubernetes.io/projected/37a27b3d-9b2f-4734-988b-e72baa85082d-kube-api-access-rx2jr\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:22 crc kubenswrapper[4909]: I1002 18:41:22.849684 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kchd4\" (UniqueName: \"kubernetes.io/projected/3b6f61e1-d6fa-438d-9382-d9548af74f3f-kube-api-access-kchd4\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.054511 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.054574 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.054619 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.055193 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.055255 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569" gracePeriod=600 Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.057971 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5e90-account-create-4mc5m" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.057964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5e90-account-create-4mc5m" event={"ID":"3b6f61e1-d6fa-438d-9382-d9548af74f3f","Type":"ContainerDied","Data":"ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71"} Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.058035 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd23e2dec32e5eedfd285756401d0be1c660b9e7bb0d24d3cc578a94c2fec71" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.065771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-752e-account-create-xqbw8" event={"ID":"37a27b3d-9b2f-4734-988b-e72baa85082d","Type":"ContainerDied","Data":"e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36"} Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.065805 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-752e-account-create-xqbw8" Oct 02 18:41:23 crc kubenswrapper[4909]: I1002 18:41:23.065816 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8c513a09e84879abdf46ee303e62e3978be785bf19abfa972f145088b9dbd36" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.088407 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569" exitCode=0 Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.088688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569"} Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.089118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142"} Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.089156 4909 scope.go:117] "RemoveContainer" containerID="cdce1121ba765dae4e3ecbe0b01dbaa7f404571af7f88aa30e85e68bfec50aa5" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.357738 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tjnr"] Oct 02 18:41:24 crc kubenswrapper[4909]: E1002 18:41:24.358152 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7ec535-cc21-449a-88f0-efdf67c0119f" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358167 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7ec535-cc21-449a-88f0-efdf67c0119f" containerName="mariadb-account-create" Oct 02 18:41:24 
crc kubenswrapper[4909]: E1002 18:41:24.358181 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a27b3d-9b2f-4734-988b-e72baa85082d" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358187 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a27b3d-9b2f-4734-988b-e72baa85082d" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: E1002 18:41:24.358215 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6f61e1-d6fa-438d-9382-d9548af74f3f" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358221 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6f61e1-d6fa-438d-9382-d9548af74f3f" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358415 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a27b3d-9b2f-4734-988b-e72baa85082d" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358425 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7ec535-cc21-449a-88f0-efdf67c0119f" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.358432 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6f61e1-d6fa-438d-9382-d9548af74f3f" containerName="mariadb-account-create" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.359084 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.364811 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.365094 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-svq85" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.369818 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.382391 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tjnr"] Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.382572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.382784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.382825 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqwp\" (UniqueName: \"kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " 
pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.382862 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.485043 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.485097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqwp\" (UniqueName: \"kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.485127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.485226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " 
pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.495174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.495440 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.503650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.513611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqwp\" (UniqueName: \"kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp\") pod \"nova-cell0-conductor-db-sync-4tjnr\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:24 crc kubenswrapper[4909]: I1002 18:41:24.678265 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:25 crc kubenswrapper[4909]: I1002 18:41:25.271670 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tjnr"] Oct 02 18:41:25 crc kubenswrapper[4909]: E1002 18:41:25.472208 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:25 crc kubenswrapper[4909]: E1002 18:41:25.474055 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:25 crc kubenswrapper[4909]: E1002 18:41:25.475969 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:25 crc kubenswrapper[4909]: E1002 18:41:25.476009 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-78c47d4bdf-9rk8w" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" Oct 02 18:41:26 crc kubenswrapper[4909]: I1002 18:41:26.117300 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" 
event={"ID":"68c9f259-4615-48e3-9c0c-89c320e100e0","Type":"ContainerStarted","Data":"f5aa29b7ed224d54dc30a882d282198fda23721106f7750c3fd1141ad6d1d001"} Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.859566 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.860239 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-central-agent" containerID="cri-o://5210280377efb97da148312ac44bc502f7873524eb8fb362773c0fe62826bc7c" gracePeriod=30 Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.860274 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="proxy-httpd" containerID="cri-o://f04f4a7980126427a1fa9b15d7d70ab299c62296897dd188530fa2b2c0b8764a" gracePeriod=30 Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.860368 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-notification-agent" containerID="cri-o://641861bb98cbb7ca585238374bb76b226a124d41acdb50477e8cde66122f71a7" gracePeriod=30 Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.860420 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="sg-core" containerID="cri-o://13a546b3014a29fa89177c8c83fafc52dfcbe971f9902bc606b89a3e4df35453" gracePeriod=30 Oct 02 18:41:28 crc kubenswrapper[4909]: I1002 18:41:28.884650 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 02 
18:41:29 crc kubenswrapper[4909]: I1002 18:41:29.157093 4909 generic.go:334] "Generic (PLEG): container finished" podID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerID="f04f4a7980126427a1fa9b15d7d70ab299c62296897dd188530fa2b2c0b8764a" exitCode=0 Oct 02 18:41:29 crc kubenswrapper[4909]: I1002 18:41:29.157335 4909 generic.go:334] "Generic (PLEG): container finished" podID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerID="13a546b3014a29fa89177c8c83fafc52dfcbe971f9902bc606b89a3e4df35453" exitCode=2 Oct 02 18:41:29 crc kubenswrapper[4909]: I1002 18:41:29.157154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerDied","Data":"f04f4a7980126427a1fa9b15d7d70ab299c62296897dd188530fa2b2c0b8764a"} Oct 02 18:41:29 crc kubenswrapper[4909]: I1002 18:41:29.157373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerDied","Data":"13a546b3014a29fa89177c8c83fafc52dfcbe971f9902bc606b89a3e4df35453"} Oct 02 18:41:30 crc kubenswrapper[4909]: I1002 18:41:30.171961 4909 generic.go:334] "Generic (PLEG): container finished" podID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerID="641861bb98cbb7ca585238374bb76b226a124d41acdb50477e8cde66122f71a7" exitCode=0 Oct 02 18:41:30 crc kubenswrapper[4909]: I1002 18:41:30.171997 4909 generic.go:334] "Generic (PLEG): container finished" podID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerID="5210280377efb97da148312ac44bc502f7873524eb8fb362773c0fe62826bc7c" exitCode=0 Oct 02 18:41:30 crc kubenswrapper[4909]: I1002 18:41:30.172022 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerDied","Data":"641861bb98cbb7ca585238374bb76b226a124d41acdb50477e8cde66122f71a7"} Oct 02 18:41:30 crc kubenswrapper[4909]: I1002 18:41:30.172080 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerDied","Data":"5210280377efb97da148312ac44bc502f7873524eb8fb362773c0fe62826bc7c"} Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.121862 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.219178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" event={"ID":"68c9f259-4615-48e3-9c0c-89c320e100e0","Type":"ContainerStarted","Data":"c192334de91920dea5030d7f865209eacf5b8e42b49b13ee26662b61c15d9117"} Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.222631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bccaa279-b390-4274-b4c8-30e1ee34d3cb","Type":"ContainerDied","Data":"972cf3a116bbb95a0161a6d21825fdc0dc517eacb076d079509975db84cdad9f"} Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.222679 4909 scope.go:117] "RemoveContainer" containerID="f04f4a7980126427a1fa9b15d7d70ab299c62296897dd188530fa2b2c0b8764a" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.222701 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.222947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.223102 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.223230 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.223270 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.223394 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.223679 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.224001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh5ws\" (UniqueName: \"kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.224128 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle\") pod \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\" (UID: \"bccaa279-b390-4274-b4c8-30e1ee34d3cb\") " Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.224780 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.227581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.227961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws" (OuterVolumeSpecName: "kube-api-access-lh5ws") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). 
InnerVolumeSpecName "kube-api-access-lh5ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.232234 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts" (OuterVolumeSpecName: "scripts") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.249448 4909 scope.go:117] "RemoveContainer" containerID="13a546b3014a29fa89177c8c83fafc52dfcbe971f9902bc606b89a3e4df35453" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.265937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.273444 4909 scope.go:117] "RemoveContainer" containerID="641861bb98cbb7ca585238374bb76b226a124d41acdb50477e8cde66122f71a7" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.295971 4909 scope.go:117] "RemoveContainer" containerID="5210280377efb97da148312ac44bc502f7873524eb8fb362773c0fe62826bc7c" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.312580 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.326587 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bccaa279-b390-4274-b4c8-30e1ee34d3cb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.326613 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.326624 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh5ws\" (UniqueName: \"kubernetes.io/projected/bccaa279-b390-4274-b4c8-30e1ee34d3cb-kube-api-access-lh5ws\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.326633 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.326641 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.361990 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data" (OuterVolumeSpecName: "config-data") pod "bccaa279-b390-4274-b4c8-30e1ee34d3cb" (UID: "bccaa279-b390-4274-b4c8-30e1ee34d3cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.428395 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccaa279-b390-4274-b4c8-30e1ee34d3cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.471687 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.474230 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.475872 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.476015 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-78c47d4bdf-9rk8w" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.562384 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-db-sync-4tjnr" podStartSLOduration=2.06326173 podStartE2EDuration="11.562359709s" podCreationTimestamp="2025-10-02 18:41:24 +0000 UTC" firstStartedPulling="2025-10-02 18:41:25.270586898 +0000 UTC m=+1406.458082757" lastFinishedPulling="2025-10-02 18:41:34.769684877 +0000 UTC m=+1415.957180736" observedRunningTime="2025-10-02 18:41:35.243464504 +0000 UTC m=+1416.430960363" watchObservedRunningTime="2025-10-02 18:41:35.562359709 +0000 UTC m=+1416.749855568" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.569831 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.582332 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.603375 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.603810 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-central-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.603828 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-central-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.603843 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="proxy-httpd" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.603850 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="proxy-httpd" Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.603867 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="sg-core" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 
18:41:35.603874 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="sg-core" Oct 02 18:41:35 crc kubenswrapper[4909]: E1002 18:41:35.603886 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-notification-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.603892 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-notification-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.604106 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-central-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.604122 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="proxy-httpd" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.604141 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="ceilometer-notification-agent" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.604153 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" containerName="sg-core" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.606287 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.625383 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccaa279-b390-4274-b4c8-30e1ee34d3cb" path="/var/lib/kubelet/pods/bccaa279-b390-4274-b4c8-30e1ee34d3cb/volumes" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.627599 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.627717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.630654 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.733586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5np9\" (UniqueName: \"kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.733936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.733971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.734080 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.734248 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.734372 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.734404 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5np9\" (UniqueName: \"kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836144 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836590 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.836676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.841103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.842782 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.844705 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.851821 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5np9\" (UniqueName: \"kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " 
pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.853451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data\") pod \"ceilometer-0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " pod="openstack/ceilometer-0" Oct 02 18:41:35 crc kubenswrapper[4909]: I1002 18:41:35.931218 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:36 crc kubenswrapper[4909]: I1002 18:41:36.453995 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:36 crc kubenswrapper[4909]: I1002 18:41:36.601906 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.254117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerStarted","Data":"29c3c6affc72c4b762093df2f02bca20a7be58f002a17fc28d71b7befed87b2b"} Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.254736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerStarted","Data":"4d5b944ddda930c70b568169d8c5c32b535c95fe4f87e8882577eba773d502ed"} Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.841014 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-78c47d4bdf-9rk8w" Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.976633 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom\") pod \"d77d6d3d-274d-46a3-b884-9acbd81526b2\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.976777 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77hdh\" (UniqueName: \"kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh\") pod \"d77d6d3d-274d-46a3-b884-9acbd81526b2\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.976804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data\") pod \"d77d6d3d-274d-46a3-b884-9acbd81526b2\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.976903 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle\") pod \"d77d6d3d-274d-46a3-b884-9acbd81526b2\" (UID: \"d77d6d3d-274d-46a3-b884-9acbd81526b2\") " Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.984169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d77d6d3d-274d-46a3-b884-9acbd81526b2" (UID: "d77d6d3d-274d-46a3-b884-9acbd81526b2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:37 crc kubenswrapper[4909]: I1002 18:41:37.984554 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh" (OuterVolumeSpecName: "kube-api-access-77hdh") pod "d77d6d3d-274d-46a3-b884-9acbd81526b2" (UID: "d77d6d3d-274d-46a3-b884-9acbd81526b2"). InnerVolumeSpecName "kube-api-access-77hdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.016150 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d77d6d3d-274d-46a3-b884-9acbd81526b2" (UID: "d77d6d3d-274d-46a3-b884-9acbd81526b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.056493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data" (OuterVolumeSpecName: "config-data") pod "d77d6d3d-274d-46a3-b884-9acbd81526b2" (UID: "d77d6d3d-274d-46a3-b884-9acbd81526b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.079634 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.079690 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.079701 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77hdh\" (UniqueName: \"kubernetes.io/projected/d77d6d3d-274d-46a3-b884-9acbd81526b2-kube-api-access-77hdh\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.079713 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77d6d3d-274d-46a3-b884-9acbd81526b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.270598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerStarted","Data":"f4352c24a607cd1d9ae9fbafdd3361907887c25ed3a41b25136dab10de75ac3a"} Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.272944 4909 generic.go:334] "Generic (PLEG): container finished" podID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" exitCode=0 Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.273011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78c47d4bdf-9rk8w" 
event={"ID":"d77d6d3d-274d-46a3-b884-9acbd81526b2","Type":"ContainerDied","Data":"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2"} Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.273052 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-78c47d4bdf-9rk8w" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.273077 4909 scope.go:117] "RemoveContainer" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.273063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78c47d4bdf-9rk8w" event={"ID":"d77d6d3d-274d-46a3-b884-9acbd81526b2","Type":"ContainerDied","Data":"c4fcb341ba35e6870e5064210058d8907997988361beb0ea5e5be7cdf95a8771"} Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.314047 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"] Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.315667 4909 scope.go:117] "RemoveContainer" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" Oct 02 18:41:38 crc kubenswrapper[4909]: E1002 18:41:38.316066 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2\": container with ID starting with 1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2 not found: ID does not exist" containerID="1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.316118 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2"} err="failed to get container status \"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2\": rpc error: code = 
NotFound desc = could not find container \"1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2\": container with ID starting with 1689fbb234ddc3757bc9421370f52683a0df7bba401256830b063df95a3b8cc2 not found: ID does not exist" Oct 02 18:41:38 crc kubenswrapper[4909]: I1002 18:41:38.324185 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-78c47d4bdf-9rk8w"] Oct 02 18:41:39 crc kubenswrapper[4909]: I1002 18:41:39.288348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerStarted","Data":"950f83e830d5a9dde46b43477917f8a1c0e74b6dbd742a0519f7e8a3c3f990fa"} Oct 02 18:41:39 crc kubenswrapper[4909]: I1002 18:41:39.620546 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" path="/var/lib/kubelet/pods/d77d6d3d-274d-46a3-b884-9acbd81526b2/volumes" Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.313055 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerStarted","Data":"f7d7f63457d4cd240f922b45519a51521f710396ff04960a2a106aff7342e881"} Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.313196 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-central-agent" containerID="cri-o://29c3c6affc72c4b762093df2f02bca20a7be58f002a17fc28d71b7befed87b2b" gracePeriod=30 Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.313760 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="proxy-httpd" containerID="cri-o://f7d7f63457d4cd240f922b45519a51521f710396ff04960a2a106aff7342e881" gracePeriod=30 Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 
18:41:41.313777 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="sg-core" containerID="cri-o://950f83e830d5a9dde46b43477917f8a1c0e74b6dbd742a0519f7e8a3c3f990fa" gracePeriod=30 Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.313847 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-notification-agent" containerID="cri-o://f4352c24a607cd1d9ae9fbafdd3361907887c25ed3a41b25136dab10de75ac3a" gracePeriod=30 Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.313910 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:41:41 crc kubenswrapper[4909]: I1002 18:41:41.349599 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4008115979999998 podStartE2EDuration="6.349562187s" podCreationTimestamp="2025-10-02 18:41:35 +0000 UTC" firstStartedPulling="2025-10-02 18:41:36.441792116 +0000 UTC m=+1417.629287975" lastFinishedPulling="2025-10-02 18:41:40.390542705 +0000 UTC m=+1421.578038564" observedRunningTime="2025-10-02 18:41:41.338606872 +0000 UTC m=+1422.526102731" watchObservedRunningTime="2025-10-02 18:41:41.349562187 +0000 UTC m=+1422.537058046" Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.341504 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerID="f7d7f63457d4cd240f922b45519a51521f710396ff04960a2a106aff7342e881" exitCode=0 Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.341953 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerID="950f83e830d5a9dde46b43477917f8a1c0e74b6dbd742a0519f7e8a3c3f990fa" exitCode=2 Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 
18:41:42.341966 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerID="f4352c24a607cd1d9ae9fbafdd3361907887c25ed3a41b25136dab10de75ac3a" exitCode=0 Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.341561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerDied","Data":"f7d7f63457d4cd240f922b45519a51521f710396ff04960a2a106aff7342e881"} Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.342008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerDied","Data":"950f83e830d5a9dde46b43477917f8a1c0e74b6dbd742a0519f7e8a3c3f990fa"} Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.342027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerDied","Data":"f4352c24a607cd1d9ae9fbafdd3361907887c25ed3a41b25136dab10de75ac3a"} Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.863471 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-9gdjk"] Oct 02 18:41:42 crc kubenswrapper[4909]: E1002 18:41:42.864345 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.864369 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.864609 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77d6d3d-274d-46a3-b884-9acbd81526b2" containerName="heat-engine" Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.865401 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.881061 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-9gdjk"] Oct 02 18:41:42 crc kubenswrapper[4909]: I1002 18:41:42.986122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lhx\" (UniqueName: \"kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx\") pod \"aodh-db-create-9gdjk\" (UID: \"db8967b4-4b6b-4de7-bd0f-29017cc79ca9\") " pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:43 crc kubenswrapper[4909]: I1002 18:41:43.088482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lhx\" (UniqueName: \"kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx\") pod \"aodh-db-create-9gdjk\" (UID: \"db8967b4-4b6b-4de7-bd0f-29017cc79ca9\") " pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:43 crc kubenswrapper[4909]: I1002 18:41:43.116506 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9lhx\" (UniqueName: \"kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx\") pod \"aodh-db-create-9gdjk\" (UID: \"db8967b4-4b6b-4de7-bd0f-29017cc79ca9\") " pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:43 crc kubenswrapper[4909]: I1002 18:41:43.186189 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:43 crc kubenswrapper[4909]: I1002 18:41:43.680712 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-9gdjk"] Oct 02 18:41:44 crc kubenswrapper[4909]: I1002 18:41:44.372226 4909 generic.go:334] "Generic (PLEG): container finished" podID="db8967b4-4b6b-4de7-bd0f-29017cc79ca9" containerID="268d53a819a1f6e76c1c0686344787a16b60cb3419710ead08d8f36cf7ee8793" exitCode=0 Oct 02 18:41:44 crc kubenswrapper[4909]: I1002 18:41:44.372350 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-9gdjk" event={"ID":"db8967b4-4b6b-4de7-bd0f-29017cc79ca9","Type":"ContainerDied","Data":"268d53a819a1f6e76c1c0686344787a16b60cb3419710ead08d8f36cf7ee8793"} Oct 02 18:41:44 crc kubenswrapper[4909]: I1002 18:41:44.372559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-9gdjk" event={"ID":"db8967b4-4b6b-4de7-bd0f-29017cc79ca9","Type":"ContainerStarted","Data":"4769407dfd4dcfa88158c61416fe48000c4f2a40c6d24b5d1c87134f1e47e4ea"} Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.383632 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerID="29c3c6affc72c4b762093df2f02bca20a7be58f002a17fc28d71b7befed87b2b" exitCode=0 Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.383698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerDied","Data":"29c3c6affc72c4b762093df2f02bca20a7be58f002a17fc28d71b7befed87b2b"} Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.763175 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.862998 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863106 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863196 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863257 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5np9\" (UniqueName: \"kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863372 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.863401 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml\") pod \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\" (UID: \"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0\") " Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.864401 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.864522 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.869523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9" (OuterVolumeSpecName: "kube-api-access-z5np9") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "kube-api-access-z5np9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.872152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts" (OuterVolumeSpecName: "scripts") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.889009 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.909319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.957808 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965626 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965691 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965709 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5np9\" (UniqueName: \"kubernetes.io/projected/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-kube-api-access-z5np9\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965744 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965759 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:45 crc kubenswrapper[4909]: I1002 18:41:45.965769 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.005314 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data" (OuterVolumeSpecName: "config-data") pod "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" (UID: "5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.067253 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9lhx\" (UniqueName: \"kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx\") pod \"db8967b4-4b6b-4de7-bd0f-29017cc79ca9\" (UID: \"db8967b4-4b6b-4de7-bd0f-29017cc79ca9\") " Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.067942 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.074000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx" (OuterVolumeSpecName: "kube-api-access-h9lhx") pod "db8967b4-4b6b-4de7-bd0f-29017cc79ca9" (UID: "db8967b4-4b6b-4de7-bd0f-29017cc79ca9"). InnerVolumeSpecName "kube-api-access-h9lhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.169556 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9lhx\" (UniqueName: \"kubernetes.io/projected/db8967b4-4b6b-4de7-bd0f-29017cc79ca9-kube-api-access-h9lhx\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.401601 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0","Type":"ContainerDied","Data":"4d5b944ddda930c70b568169d8c5c32b535c95fe4f87e8882577eba773d502ed"} Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.401989 4909 scope.go:117] "RemoveContainer" containerID="f7d7f63457d4cd240f922b45519a51521f710396ff04960a2a106aff7342e881" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.402189 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.410119 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-9gdjk" event={"ID":"db8967b4-4b6b-4de7-bd0f-29017cc79ca9","Type":"ContainerDied","Data":"4769407dfd4dcfa88158c61416fe48000c4f2a40c6d24b5d1c87134f1e47e4ea"} Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.410353 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4769407dfd4dcfa88158c61416fe48000c4f2a40c6d24b5d1c87134f1e47e4ea" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.410323 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-9gdjk" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.464191 4909 scope.go:117] "RemoveContainer" containerID="950f83e830d5a9dde46b43477917f8a1c0e74b6dbd742a0519f7e8a3c3f990fa" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.496091 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.527977 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.535238 4909 scope.go:117] "RemoveContainer" containerID="f4352c24a607cd1d9ae9fbafdd3361907887c25ed3a41b25136dab10de75ac3a" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.567277 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:46 crc kubenswrapper[4909]: E1002 18:41:46.567946 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-notification-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.567970 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-notification-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: E1002 18:41:46.567996 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="sg-core" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568004 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="sg-core" Oct 02 18:41:46 crc kubenswrapper[4909]: E1002 18:41:46.568045 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8967b4-4b6b-4de7-bd0f-29017cc79ca9" containerName="mariadb-database-create" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568056 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="db8967b4-4b6b-4de7-bd0f-29017cc79ca9" containerName="mariadb-database-create" Oct 02 18:41:46 crc kubenswrapper[4909]: E1002 18:41:46.568080 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="proxy-httpd" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568088 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="proxy-httpd" Oct 02 18:41:46 crc kubenswrapper[4909]: E1002 18:41:46.568102 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-central-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568110 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-central-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568364 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-notification-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568398 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8967b4-4b6b-4de7-bd0f-29017cc79ca9" containerName="mariadb-database-create" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568414 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="ceilometer-central-agent" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568429 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="proxy-httpd" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.568443 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" containerName="sg-core" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.576573 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.581984 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.582243 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.606435 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.659471 4909 scope.go:117] "RemoveContainer" containerID="29c3c6affc72c4b762093df2f02bca20a7be58f002a17fc28d71b7befed87b2b" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp987\" (UniqueName: \"kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698162 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698205 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698252 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.698375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800447 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp987\" (UniqueName: \"kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800707 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.800751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.803014 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.803391 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.805518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.805559 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.806811 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.807949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 
18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.827528 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp987\" (UniqueName: \"kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987\") pod \"ceilometer-0\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " pod="openstack/ceilometer-0" Oct 02 18:41:46 crc kubenswrapper[4909]: I1002 18:41:46.935910 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:41:48 crc kubenswrapper[4909]: W1002 18:41:47.431337 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f66d6a_589f_4c63_9f82_84bbeee56676.slice/crio-ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d WatchSource:0}: Error finding container ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d: Status 404 returned error can't find the container with id ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:47.433257 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:47.623647 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0" path="/var/lib/kubelet/pods/5f9260f6-4b1e-4dbc-94db-1f1ce9ae52d0/volumes" Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:48.435478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerStarted","Data":"c6a801bc60f11ab0c0662b7f78d788419233f8d0754312bb33cdee33c2f0185e"} Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:48.435904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerStarted","Data":"ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d"} Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:48.458412 4909 generic.go:334] "Generic (PLEG): container finished" podID="68c9f259-4615-48e3-9c0c-89c320e100e0" containerID="c192334de91920dea5030d7f865209eacf5b8e42b49b13ee26662b61c15d9117" exitCode=0 Oct 02 18:41:48 crc kubenswrapper[4909]: I1002 18:41:48.458540 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" event={"ID":"68c9f259-4615-48e3-9c0c-89c320e100e0","Type":"ContainerDied","Data":"c192334de91920dea5030d7f865209eacf5b8e42b49b13ee26662b61c15d9117"} Oct 02 18:41:49 crc kubenswrapper[4909]: I1002 18:41:49.471071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerStarted","Data":"3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d"} Oct 02 18:41:49 crc kubenswrapper[4909]: I1002 18:41:49.935025 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.071767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts\") pod \"68c9f259-4615-48e3-9c0c-89c320e100e0\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.072192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle\") pod \"68c9f259-4615-48e3-9c0c-89c320e100e0\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.072364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data\") pod \"68c9f259-4615-48e3-9c0c-89c320e100e0\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.072544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqwp\" (UniqueName: \"kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp\") pod \"68c9f259-4615-48e3-9c0c-89c320e100e0\" (UID: \"68c9f259-4615-48e3-9c0c-89c320e100e0\") " Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.078165 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts" (OuterVolumeSpecName: "scripts") pod "68c9f259-4615-48e3-9c0c-89c320e100e0" (UID: "68c9f259-4615-48e3-9c0c-89c320e100e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.085193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp" (OuterVolumeSpecName: "kube-api-access-bpqwp") pod "68c9f259-4615-48e3-9c0c-89c320e100e0" (UID: "68c9f259-4615-48e3-9c0c-89c320e100e0"). InnerVolumeSpecName "kube-api-access-bpqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.104898 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68c9f259-4615-48e3-9c0c-89c320e100e0" (UID: "68c9f259-4615-48e3-9c0c-89c320e100e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.105530 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data" (OuterVolumeSpecName: "config-data") pod "68c9f259-4615-48e3-9c0c-89c320e100e0" (UID: "68c9f259-4615-48e3-9c0c-89c320e100e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.175998 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpqwp\" (UniqueName: \"kubernetes.io/projected/68c9f259-4615-48e3-9c0c-89c320e100e0-kube-api-access-bpqwp\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.176046 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.176056 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.176064 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c9f259-4615-48e3-9c0c-89c320e100e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.492064 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" event={"ID":"68c9f259-4615-48e3-9c0c-89c320e100e0","Type":"ContainerDied","Data":"f5aa29b7ed224d54dc30a882d282198fda23721106f7750c3fd1141ad6d1d001"} Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.492136 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5aa29b7ed224d54dc30a882d282198fda23721106f7750c3fd1141ad6d1d001" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.492227 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tjnr" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.499898 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerStarted","Data":"ba435c93015d0d32f1ba47dc65036de355fb859da9b8d9f5b78d8c4e6f712460"} Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.591388 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 18:41:50 crc kubenswrapper[4909]: E1002 18:41:50.591779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c9f259-4615-48e3-9c0c-89c320e100e0" containerName="nova-cell0-conductor-db-sync" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.591793 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c9f259-4615-48e3-9c0c-89c320e100e0" containerName="nova-cell0-conductor-db-sync" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.591996 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c9f259-4615-48e3-9c0c-89c320e100e0" containerName="nova-cell0-conductor-db-sync" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.592675 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.596316 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-svq85" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.596600 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.606144 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.685329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.685623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.685709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7pf\" (UniqueName: \"kubernetes.io/projected/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-kube-api-access-lj7pf\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.786873 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.787607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.787680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7pf\" (UniqueName: \"kubernetes.io/projected/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-kube-api-access-lj7pf\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.791523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.793715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.812441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7pf\" (UniqueName: \"kubernetes.io/projected/3a482be6-ee1c-4b4b-a1fe-e05813afe8c1-kube-api-access-lj7pf\") pod \"nova-cell0-conductor-0\" 
(UID: \"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1\") " pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:50 crc kubenswrapper[4909]: I1002 18:41:50.911254 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:51 crc kubenswrapper[4909]: I1002 18:41:51.401392 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 18:41:51 crc kubenswrapper[4909]: W1002 18:41:51.407738 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a482be6_ee1c_4b4b_a1fe_e05813afe8c1.slice/crio-7220932d681d1bb3fc527781ace2a0cc531e2291318e86bbfa57d018b2410c8e WatchSource:0}: Error finding container 7220932d681d1bb3fc527781ace2a0cc531e2291318e86bbfa57d018b2410c8e: Status 404 returned error can't find the container with id 7220932d681d1bb3fc527781ace2a0cc531e2291318e86bbfa57d018b2410c8e Oct 02 18:41:51 crc kubenswrapper[4909]: I1002 18:41:51.514485 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1","Type":"ContainerStarted","Data":"7220932d681d1bb3fc527781ace2a0cc531e2291318e86bbfa57d018b2410c8e"} Oct 02 18:41:51 crc kubenswrapper[4909]: I1002 18:41:51.518981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerStarted","Data":"18fd289cab681d1b85aee0fa477341f650d2ec2921c3848b71442c545bec6afb"} Oct 02 18:41:51 crc kubenswrapper[4909]: I1002 18:41:51.519239 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:41:51 crc kubenswrapper[4909]: I1002 18:41:51.552196 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2964564210000002 podStartE2EDuration="5.552173887s" 
podCreationTimestamp="2025-10-02 18:41:46 +0000 UTC" firstStartedPulling="2025-10-02 18:41:47.43492697 +0000 UTC m=+1428.622422829" lastFinishedPulling="2025-10-02 18:41:50.690644436 +0000 UTC m=+1431.878140295" observedRunningTime="2025-10-02 18:41:51.544112473 +0000 UTC m=+1432.731608332" watchObservedRunningTime="2025-10-02 18:41:51.552173887 +0000 UTC m=+1432.739669746" Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.532858 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a482be6-ee1c-4b4b-a1fe-e05813afe8c1","Type":"ContainerStarted","Data":"7294bd8a5fde7b10be1d0b4a897345c35ce2025b6ff61c025fd6cd4367d67912"} Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.533282 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.560865 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.560838304 podStartE2EDuration="2.560838304s" podCreationTimestamp="2025-10-02 18:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:41:52.553364088 +0000 UTC m=+1433.740859947" watchObservedRunningTime="2025-10-02 18:41:52.560838304 +0000 UTC m=+1433.748334183" Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.971225 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-d265-account-create-nxdxn"] Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.973560 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.987400 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 02 18:41:52 crc kubenswrapper[4909]: I1002 18:41:52.992397 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-d265-account-create-nxdxn"] Oct 02 18:41:53 crc kubenswrapper[4909]: I1002 18:41:53.033646 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247gw\" (UniqueName: \"kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw\") pod \"aodh-d265-account-create-nxdxn\" (UID: \"9d54c6c8-3bf0-462b-b69a-fd242e123ae5\") " pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:53 crc kubenswrapper[4909]: I1002 18:41:53.135572 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247gw\" (UniqueName: \"kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw\") pod \"aodh-d265-account-create-nxdxn\" (UID: \"9d54c6c8-3bf0-462b-b69a-fd242e123ae5\") " pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:53 crc kubenswrapper[4909]: I1002 18:41:53.164930 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247gw\" (UniqueName: \"kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw\") pod \"aodh-d265-account-create-nxdxn\" (UID: \"9d54c6c8-3bf0-462b-b69a-fd242e123ae5\") " pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:53 crc kubenswrapper[4909]: I1002 18:41:53.300277 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:53 crc kubenswrapper[4909]: I1002 18:41:53.815066 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-d265-account-create-nxdxn"] Oct 02 18:41:53 crc kubenswrapper[4909]: W1002 18:41:53.825612 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d54c6c8_3bf0_462b_b69a_fd242e123ae5.slice/crio-c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c WatchSource:0}: Error finding container c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c: Status 404 returned error can't find the container with id c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.432578 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.435203 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.448892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.551613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d265-account-create-nxdxn" event={"ID":"9d54c6c8-3bf0-462b-b69a-fd242e123ae5","Type":"ContainerStarted","Data":"465296b3e3c237d3e9f2a6061f2ea5c13459c17911c91bd98391c2cc51d3789a"} Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.551675 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d265-account-create-nxdxn" event={"ID":"9d54c6c8-3bf0-462b-b69a-fd242e123ae5","Type":"ContainerStarted","Data":"c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c"} Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.562684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlwc\" (UniqueName: \"kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.562772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.562850 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities\") pod 
\"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.664987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlwc\" (UniqueName: \"kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.665404 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.665960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.666149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.666663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities\") pod \"community-operators-w24fx\" (UID: 
\"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.686260 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlwc\" (UniqueName: \"kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc\") pod \"community-operators-w24fx\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:54 crc kubenswrapper[4909]: I1002 18:41:54.761132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:41:55 crc kubenswrapper[4909]: I1002 18:41:55.355645 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:41:55 crc kubenswrapper[4909]: I1002 18:41:55.569713 4909 generic.go:334] "Generic (PLEG): container finished" podID="9d54c6c8-3bf0-462b-b69a-fd242e123ae5" containerID="465296b3e3c237d3e9f2a6061f2ea5c13459c17911c91bd98391c2cc51d3789a" exitCode=0 Oct 02 18:41:55 crc kubenswrapper[4909]: I1002 18:41:55.569779 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d265-account-create-nxdxn" event={"ID":"9d54c6c8-3bf0-462b-b69a-fd242e123ae5","Type":"ContainerDied","Data":"465296b3e3c237d3e9f2a6061f2ea5c13459c17911c91bd98391c2cc51d3789a"} Oct 02 18:41:55 crc kubenswrapper[4909]: I1002 18:41:55.576658 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerStarted","Data":"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451"} Oct 02 18:41:55 crc kubenswrapper[4909]: I1002 18:41:55.576705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" 
event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerStarted","Data":"acc886f264b233fe4a79530bd4cb3f3c2927f63c6c58e10a3b6aaabb0c4d58c4"} Oct 02 18:41:56 crc kubenswrapper[4909]: I1002 18:41:56.602118 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerID="9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451" exitCode=0 Oct 02 18:41:56 crc kubenswrapper[4909]: I1002 18:41:56.603160 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerDied","Data":"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451"} Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.090807 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.245335 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247gw\" (UniqueName: \"kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw\") pod \"9d54c6c8-3bf0-462b-b69a-fd242e123ae5\" (UID: \"9d54c6c8-3bf0-462b-b69a-fd242e123ae5\") " Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.279097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw" (OuterVolumeSpecName: "kube-api-access-247gw") pod "9d54c6c8-3bf0-462b-b69a-fd242e123ae5" (UID: "9d54c6c8-3bf0-462b-b69a-fd242e123ae5"). InnerVolumeSpecName "kube-api-access-247gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.347886 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247gw\" (UniqueName: \"kubernetes.io/projected/9d54c6c8-3bf0-462b-b69a-fd242e123ae5-kube-api-access-247gw\") on node \"crc\" DevicePath \"\"" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.634256 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-d265-account-create-nxdxn" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.639840 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-d265-account-create-nxdxn" event={"ID":"9d54c6c8-3bf0-462b-b69a-fd242e123ae5","Type":"ContainerDied","Data":"c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c"} Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.639891 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c993eb8ca724fd3b6e8ec52aa2ced6fd2acec8b0adc28a98186f64969dbf1c6c" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.639905 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:41:57 crc kubenswrapper[4909]: E1002 18:41:57.640307 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d54c6c8-3bf0-462b-b69a-fd242e123ae5" containerName="mariadb-account-create" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.640329 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d54c6c8-3bf0-462b-b69a-fd242e123ae5" containerName="mariadb-account-create" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.640641 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d54c6c8-3bf0-462b-b69a-fd242e123ae5" containerName="mariadb-account-create" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.642692 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.642807 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.755116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.755227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4mn\" (UniqueName: \"kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.755270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.857022 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.857146 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8j4mn\" (UniqueName: \"kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.857187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.857765 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.857808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.880376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4mn\" (UniqueName: \"kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn\") pod \"certified-operators-tb477\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:57 crc kubenswrapper[4909]: I1002 18:41:57.977727 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:41:58 crc kubenswrapper[4909]: I1002 18:41:58.512930 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:41:58 crc kubenswrapper[4909]: I1002 18:41:58.646452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerStarted","Data":"8eb35fe3063724b9530c85f5af7a5fc1663e46bf522f95e05585b60a51ae3c8f"} Oct 02 18:41:59 crc kubenswrapper[4909]: I1002 18:41:59.665765 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerID="e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5" exitCode=0 Oct 02 18:41:59 crc kubenswrapper[4909]: I1002 18:41:59.666168 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerDied","Data":"e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5"} Oct 02 18:41:59 crc kubenswrapper[4909]: I1002 18:41:59.671080 4909 generic.go:334] "Generic (PLEG): container finished" podID="cb76447a-6677-4bf2-9775-e273f1524050" containerID="878906bdc0866c50a4d90e1a24224d4ec7aa73ae7c8e18ad8ea3a5ddaf0e1998" exitCode=0 Oct 02 18:41:59 crc kubenswrapper[4909]: I1002 18:41:59.671165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerDied","Data":"878906bdc0866c50a4d90e1a24224d4ec7aa73ae7c8e18ad8ea3a5ddaf0e1998"} Oct 02 18:42:00 crc kubenswrapper[4909]: I1002 18:42:00.955773 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.432129 4909 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xlfn6"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.434121 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.437291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.437501 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.441345 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlfn6"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.541839 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54pw\" (UniqueName: \"kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.541908 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.542233 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " 
pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.542312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.654329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.658680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.658940 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54pw\" (UniqueName: \"kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.659131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " 
pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.676345 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.684247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.730822 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.734005 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.738225 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.738472 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54pw\" (UniqueName: \"kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw\") pod \"nova-cell0-cell-mapping-xlfn6\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.762884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.763040 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhvb\" (UniqueName: \"kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.763155 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.763234 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.782697 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.784998 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.799150 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.809179 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.814089 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.816456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerStarted","Data":"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5"} Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.830118 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.845001 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.862620 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.865120 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerStarted","Data":"a2229e5e472877011137da7d21b00fcba7df94273f66642e5cdab84fcb0e8f0f"} Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.866087 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.878366 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.893114 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.894841 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhvb\" (UniqueName: \"kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905812 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwqc8\" (UniqueName: \"kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 
18:42:01.905928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.905974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.906345 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.913165 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.914305 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.916140 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.919062 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " 
pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.945597 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhvb\" (UniqueName: \"kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb\") pod \"nova-api-0\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " pod="openstack/nova-api-0" Oct 02 18:42:01 crc kubenswrapper[4909]: I1002 18:42:01.949727 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.010923 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556gr\" (UniqueName: \"kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwqc8\" (UniqueName: \"kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011139 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data\") pod 
\"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011175 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011278 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqtx\" (UniqueName: \"kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 
18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.011462 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.017052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.017504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.030975 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w24fx" podStartSLOduration=3.704291843 podStartE2EDuration="8.03095608s" podCreationTimestamp="2025-10-02 18:41:54 +0000 UTC" firstStartedPulling="2025-10-02 18:41:56.603959118 +0000 UTC m=+1437.791454977" lastFinishedPulling="2025-10-02 18:42:00.930623355 +0000 UTC m=+1442.118119214" observedRunningTime="2025-10-02 18:42:01.914626425 +0000 UTC m=+1443.102122284" watchObservedRunningTime="2025-10-02 18:42:02.03095608 +0000 UTC m=+1443.218451939" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.059875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwqc8\" (UniqueName: 
\"kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8\") pod \"nova-cell1-novncproxy-0\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.097225 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"] Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.099010 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.103431 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.117864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqtx\" (UniqueName: \"kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.117970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118039 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556gr\" (UniqueName: \"kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118386 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.118876 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"] Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.119441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.127305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.127781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.127792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.127838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0" Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.133939 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.143555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqtx\" (UniqueName: \"kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx\") pod \"nova-metadata-0\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " pod="openstack/nova-metadata-0"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.174848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556gr\" (UniqueName: \"kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr\") pod \"nova-scheduler-0\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " pod="openstack/nova-scheduler-0"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220715 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220863 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9qr\" (UniqueName: \"kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.220986 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.238944 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.268574 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336713 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9qr\" (UniqueName: \"kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336874 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.336902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.338178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.338891 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.343230 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.355168 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.356749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.381913 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9qr\" (UniqueName: \"kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr\") pod \"dnsmasq-dns-5fbc4d444f-h9fp2\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.455546 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2"
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.588083 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlfn6"]
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.917160 4909 generic.go:334] "Generic (PLEG): container finished" podID="cb76447a-6677-4bf2-9775-e273f1524050" containerID="a2229e5e472877011137da7d21b00fcba7df94273f66642e5cdab84fcb0e8f0f" exitCode=0
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.918410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerDied","Data":"a2229e5e472877011137da7d21b00fcba7df94273f66642e5cdab84fcb0e8f0f"}
Oct 02 18:42:02 crc kubenswrapper[4909]: I1002 18:42:02.930997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlfn6" event={"ID":"57665172-f334-4a30-b3d0-60b0a50183e9","Type":"ContainerStarted","Data":"ee8aee27f989682a55fb0be59df904257b57403b6b91cb8f4c5e46a9d1eafdae"}
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.485066 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-425k4"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.487965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.491523 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t2tfr"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.491793 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.491947 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.523532 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-425k4"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.581347 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjlw\" (UniqueName: \"kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.581734 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.581788 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.581856 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.684801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.685521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.685687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.685832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjlw\" (UniqueName: \"kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: W1002 18:42:03.696800 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb7a8f4_dd8d_4940_a84e_28364cb8c285.slice/crio-c407cb301a87ef69454df0e66a0a97408004d5c52802821ad8bedb32697ca1ef WatchSource:0}: Error finding container c407cb301a87ef69454df0e66a0a97408004d5c52802821ad8bedb32697ca1ef: Status 404 returned error can't find the container with id c407cb301a87ef69454df0e66a0a97408004d5c52802821ad8bedb32697ca1ef
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.764149 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.764206 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.764218 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.772591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjlw\" (UniqueName: \"kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.772964 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.774496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.797677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle\") pod \"aodh-db-sync-425k4\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.807628 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 18:42:03 crc kubenswrapper[4909]: W1002 18:42:03.811214 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51695048_17a4_40ae_bbfd_9dc1f27233fe.slice/crio-affcaa742e51fa25cfb00bcc0a9ea10b01f74e612ab3ce794001a487af9d654b WatchSource:0}: Error finding container affcaa742e51fa25cfb00bcc0a9ea10b01f74e612ab3ce794001a487af9d654b: Status 404 returned error can't find the container with id affcaa742e51fa25cfb00bcc0a9ea10b01f74e612ab3ce794001a487af9d654b
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.900197 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vv8d"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.904295 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.906634 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.913621 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.923222 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vv8d"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.951928 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.961617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08c76dbf-4440-4ec1-98a3-14667d5c8a2a","Type":"ContainerStarted","Data":"53c81099f054963c0b842b16900cccd727e0b22fd5d06538f898d6c0a2fe5ab5"}
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.972223 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerStarted","Data":"affcaa742e51fa25cfb00bcc0a9ea10b01f74e612ab3ce794001a487af9d654b"}
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.979933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerStarted","Data":"9fc53444c46e0c729f22dd938df6b949cf19ec7d6fddc2702af1bb0ce0088684"}
Oct 02 18:42:03 crc kubenswrapper[4909]: I1002 18:42:03.986538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlfn6" event={"ID":"57665172-f334-4a30-b3d0-60b0a50183e9","Type":"ContainerStarted","Data":"2c9f9515b5d2294949cef5236454c43f368efcf326b1c5b3406e0771f524ec49"}
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.000145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" event={"ID":"fbb7a8f4-dd8d-4940-a84e-28364cb8c285","Type":"ContainerStarted","Data":"c407cb301a87ef69454df0e66a0a97408004d5c52802821ad8bedb32697ca1ef"}
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.017076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qxnl\" (UniqueName: \"kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.017373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.017418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.017470 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.021833 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xlfn6" podStartSLOduration=3.02181421 podStartE2EDuration="3.02181421s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:04.01735015 +0000 UTC m=+1445.204846009" watchObservedRunningTime="2025-10-02 18:42:04.02181421 +0000 UTC m=+1445.209310069"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.064398 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-425k4"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.119513 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qxnl\" (UniqueName: \"kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.119581 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.119625 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.119673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.130501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.134588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.135786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.151182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qxnl\" (UniqueName: \"kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl\") pod \"nova-cell1-conductor-db-sync-5vv8d\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.231652 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vv8d"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.725994 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-425k4"]
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.762544 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w24fx"
Oct 02 18:42:04 crc kubenswrapper[4909]: I1002 18:42:04.762577 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w24fx"
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.019549 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vv8d"]
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.057766 4909 generic.go:334] "Generic (PLEG): container finished" podID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerID="6bda7cb8182b2bd0adc961780af2d54dd02409e69449543d8b01e2ff2c07c140" exitCode=0
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.057868 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" event={"ID":"fbb7a8f4-dd8d-4940-a84e-28364cb8c285","Type":"ContainerDied","Data":"6bda7cb8182b2bd0adc961780af2d54dd02409e69449543d8b01e2ff2c07c140"}
Oct 02 18:42:05 crc kubenswrapper[4909]: W1002 18:42:05.061592 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3230517_79be_47ce_83ca_f423e167c5f1.slice/crio-04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb WatchSource:0}: Error finding container 04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb: Status 404 returned error can't find the container with id 04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.066383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-425k4" event={"ID":"63441e27-e4a8-4db3-9c60-5cd65f0e136a","Type":"ContainerStarted","Data":"e09e6fee1bc22a1a621fc96405492f45405df32845f34305e787f66473fab947"}
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.073561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf","Type":"ContainerStarted","Data":"5e335229afc5bd687f7abc6d90326990acb1a0d3e8ded5f892991b85a8b005d8"}
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.116755 4909 generic.go:334] "Generic (PLEG): container finished" podID="20c64b6e-d451-4962-9793-532f6c31f79d" containerID="a9c7e2dcc9cfc4a50071740d1912151f9e585eb62535a42a7a72094b1eadd883" exitCode=137
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.116946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" event={"ID":"20c64b6e-d451-4962-9793-532f6c31f79d","Type":"ContainerDied","Data":"a9c7e2dcc9cfc4a50071740d1912151f9e585eb62535a42a7a72094b1eadd883"}
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.143053 4909 generic.go:334] "Generic (PLEG): container finished" podID="25396a7c-449c-4891-9f05-80acf9ef5309" containerID="21d99a2ba58de400adfed2c828e3916fd93dc568857260758f92538f615cb8c5" exitCode=137
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.143146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f896796b-dt79t" event={"ID":"25396a7c-449c-4891-9f05-80acf9ef5309","Type":"ContainerDied","Data":"21d99a2ba58de400adfed2c828e3916fd93dc568857260758f92538f615cb8c5"}
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.238430 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tb477" podStartSLOduration=3.447154323 podStartE2EDuration="8.238406647s" podCreationTimestamp="2025-10-02 18:41:57 +0000 UTC" firstStartedPulling="2025-10-02 18:41:59.672573891 +0000 UTC m=+1440.860069760" lastFinishedPulling="2025-10-02 18:42:04.463826225 +0000 UTC m=+1445.651322084" observedRunningTime="2025-10-02 18:42:05.234147773 +0000 UTC m=+1446.421643642" watchObservedRunningTime="2025-10-02 18:42:05.238406647 +0000 UTC m=+1446.425902506"
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.768327 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.851711 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58db674fb5-5s8s5"
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.867661 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle\") pod \"20c64b6e-d451-4962-9793-532f6c31f79d\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.867749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9rhw\" (UniqueName: \"kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw\") pod \"20c64b6e-d451-4962-9793-532f6c31f79d\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.867783 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle\") pod \"25396a7c-449c-4891-9f05-80acf9ef5309\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.867804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data\") pod \"20c64b6e-d451-4962-9793-532f6c31f79d\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.867907 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom\") pod \"25396a7c-449c-4891-9f05-80acf9ef5309\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.868013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whg68\" (UniqueName: \"kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68\") pod \"25396a7c-449c-4891-9f05-80acf9ef5309\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.868055 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data\") pod \"25396a7c-449c-4891-9f05-80acf9ef5309\" (UID: \"25396a7c-449c-4891-9f05-80acf9ef5309\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.869155 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom\") pod \"20c64b6e-d451-4962-9793-532f6c31f79d\" (UID: \"20c64b6e-d451-4962-9793-532f6c31f79d\") "
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.900905 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25396a7c-449c-4891-9f05-80acf9ef5309" (UID: "25396a7c-449c-4891-9f05-80acf9ef5309"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.901257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68" (OuterVolumeSpecName: "kube-api-access-whg68") pod "25396a7c-449c-4891-9f05-80acf9ef5309" (UID: "25396a7c-449c-4891-9f05-80acf9ef5309"). InnerVolumeSpecName "kube-api-access-whg68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.901533 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw" (OuterVolumeSpecName: "kube-api-access-x9rhw") pod "20c64b6e-d451-4962-9793-532f6c31f79d" (UID: "20c64b6e-d451-4962-9793-532f6c31f79d"). InnerVolumeSpecName "kube-api-access-x9rhw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.906738 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "20c64b6e-d451-4962-9793-532f6c31f79d" (UID: "20c64b6e-d451-4962-9793-532f6c31f79d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.952057 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25396a7c-449c-4891-9f05-80acf9ef5309" (UID: "25396a7c-449c-4891-9f05-80acf9ef5309"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.963070 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-w24fx" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="registry-server" probeResult="failure" output=<
Oct 02 18:42:05 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s
Oct 02 18:42:05 crc kubenswrapper[4909]: >
Oct 02 18:42:05 crc kubenswrapper[4909]: I1002 18:42:05.973100 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c64b6e-d451-4962-9793-532f6c31f79d" (UID: "20c64b6e-d451-4962-9793-532f6c31f79d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002423 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whg68\" (UniqueName: \"kubernetes.io/projected/25396a7c-449c-4891-9f05-80acf9ef5309-kube-api-access-whg68\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002455 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002464 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002473 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9rhw\" (UniqueName: \"kubernetes.io/projected/20c64b6e-d451-4962-9793-532f6c31f79d-kube-api-access-x9rhw\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002481 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.002489 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.065349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data" (OuterVolumeSpecName: "config-data") pod "20c64b6e-d451-4962-9793-532f6c31f79d" (UID: "20c64b6e-d451-4962-9793-532f6c31f79d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.093148 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data" (OuterVolumeSpecName: "config-data") pod "25396a7c-449c-4891-9f05-80acf9ef5309" (UID: "25396a7c-449c-4891-9f05-80acf9ef5309"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.107278 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c64b6e-d451-4962-9793-532f6c31f79d-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.107876 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25396a7c-449c-4891-9f05-80acf9ef5309-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.223466 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f896796b-dt79t"
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.223540 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f896796b-dt79t" event={"ID":"25396a7c-449c-4891-9f05-80acf9ef5309","Type":"ContainerDied","Data":"628dd04662407a06e58c7208dc642af79fb1378b47c9aa185ebc8773c7b73065"}
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.223611 4909 scope.go:117] "RemoveContainer" containerID="21d99a2ba58de400adfed2c828e3916fd93dc568857260758f92538f615cb8c5"
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.241150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerStarted","Data":"52d3a17b8128c4f37aa8403976e03a3ffe8ec8ac90ba72cf60845644fcfc43d2"}
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.254533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" event={"ID":"e3230517-79be-47ce-83ca-f423e167c5f1","Type":"ContainerStarted","Data":"776b21505c4ed56155dc4d3af85f15489de7333b08067fbc91be6f7cbf0da4f1"}
Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.254575 4909 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" event={"ID":"e3230517-79be-47ce-83ca-f423e167c5f1","Type":"ContainerStarted","Data":"04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb"} Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.276771 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.283703 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" event={"ID":"fbb7a8f4-dd8d-4940-a84e-28364cb8c285","Type":"ContainerStarted","Data":"bfd9a14a1e83428e9662597c47b9b0e06c230cc77bd72f9aaf956efa8e5f0cbd"} Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.284685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.300215 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" podStartSLOduration=3.300196918 podStartE2EDuration="3.300196918s" podCreationTimestamp="2025-10-02 18:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:06.281674174 +0000 UTC m=+1447.469170033" watchObservedRunningTime="2025-10-02 18:42:06.300196918 +0000 UTC m=+1447.487692777" Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.302057 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.302612 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" event={"ID":"20c64b6e-d451-4962-9793-532f6c31f79d","Type":"ContainerDied","Data":"fd60f3311043b3a323e0fef1299ddd71126dd82431c79d1c5ce1a978122eb073"} Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.302709 4909 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.362668 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" podStartSLOduration=5.362651315 podStartE2EDuration="5.362651315s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:06.30755614 +0000 UTC m=+1447.495051999" watchObservedRunningTime="2025-10-02 18:42:06.362651315 +0000 UTC m=+1447.550147174" Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.375674 4909 scope.go:117] "RemoveContainer" containerID="a9c7e2dcc9cfc4a50071740d1912151f9e585eb62535a42a7a72094b1eadd883" Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.400646 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9f896796b-dt79t"] Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.407838 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-9f896796b-dt79t"] Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.416171 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"] Oct 02 18:42:06 crc kubenswrapper[4909]: I1002 18:42:06.428565 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58db674fb5-5s8s5"] Oct 02 18:42:07 crc kubenswrapper[4909]: I1002 18:42:07.634068 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" path="/var/lib/kubelet/pods/20c64b6e-d451-4962-9793-532f6c31f79d/volumes" Oct 02 18:42:07 crc kubenswrapper[4909]: I1002 18:42:07.635453 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" path="/var/lib/kubelet/pods/25396a7c-449c-4891-9f05-80acf9ef5309/volumes" Oct 02 
18:42:07 crc kubenswrapper[4909]: I1002 18:42:07.979100 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:07 crc kubenswrapper[4909]: I1002 18:42:07.979532 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:08 crc kubenswrapper[4909]: I1002 18:42:08.032618 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:12 crc kubenswrapper[4909]: I1002 18:42:12.457179 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" Oct 02 18:42:12 crc kubenswrapper[4909]: I1002 18:42:12.565553 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"] Oct 02 18:42:12 crc kubenswrapper[4909]: I1002 18:42:12.565830 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="dnsmasq-dns" containerID="cri-o://7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f" gracePeriod=10 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.249070 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.275966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.276097 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.276146 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.276378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.276430 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942gv\" (UniqueName: \"kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.276464 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.284583 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv" (OuterVolumeSpecName: "kube-api-access-942gv") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "kube-api-access-942gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.360487 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.363305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.382096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384011 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") pod \"a4ec2d23-77a0-413f-a539-39098058c284\" (UID: \"a4ec2d23-77a0-413f-a539-39098058c284\") " Oct 02 18:42:13 crc kubenswrapper[4909]: W1002 18:42:13.384254 4909 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a4ec2d23-77a0-413f-a539-39098058c284/volumes/kubernetes.io~configmap/dns-swift-storage-0 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384271 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384531 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384547 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384556 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942gv\" (UniqueName: \"kubernetes.io/projected/a4ec2d23-77a0-413f-a539-39098058c284-kube-api-access-942gv\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.384566 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.408733 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: "a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.428345 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08c76dbf-4440-4ec1-98a3-14667d5c8a2a","Type":"ContainerStarted","Data":"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.429963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerStarted","Data":"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.429985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerStarted","Data":"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.430093 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-log" containerID="cri-o://62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" gracePeriod=30 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.430210 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-metadata" containerID="cri-o://8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" gracePeriod=30 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.430609 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config" (OuterVolumeSpecName: "config") pod "a4ec2d23-77a0-413f-a539-39098058c284" (UID: 
"a4ec2d23-77a0-413f-a539-39098058c284"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.432881 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630" gracePeriod=30 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.434852 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerStarted","Data":"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.434955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerStarted","Data":"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.437711 4909 generic.go:334] "Generic (PLEG): container finished" podID="57665172-f334-4a30-b3d0-60b0a50183e9" containerID="2c9f9515b5d2294949cef5236454c43f368efcf326b1c5b3406e0771f524ec49" exitCode=0 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.437835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlfn6" event={"ID":"57665172-f334-4a30-b3d0-60b0a50183e9","Type":"ContainerDied","Data":"2c9f9515b5d2294949cef5236454c43f368efcf326b1c5b3406e0771f524ec49"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.441685 4909 generic.go:334] "Generic (PLEG): container finished" podID="a4ec2d23-77a0-413f-a539-39098058c284" containerID="7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f" exitCode=0 Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.441822 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" event={"ID":"a4ec2d23-77a0-413f-a539-39098058c284","Type":"ContainerDied","Data":"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.441907 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" event={"ID":"a4ec2d23-77a0-413f-a539-39098058c284","Type":"ContainerDied","Data":"1e3ed82e93d7adee4f815c3121ec6105b886bfe7a44928177d0c52c3e3b2dc5a"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.441981 4909 scope.go:117] "RemoveContainer" containerID="7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.442189 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-d75nf" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.453457 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-425k4" event={"ID":"63441e27-e4a8-4db3-9c60-5cd65f0e136a","Type":"ContainerStarted","Data":"7b3f29e1867c04993e0899074e0472781c29e2189fea3c4f84d4edf49c5b8948"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.458270 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.04548604 podStartE2EDuration="12.458255225s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="2025-10-02 18:42:03.536493741 +0000 UTC m=+1444.723989590" lastFinishedPulling="2025-10-02 18:42:11.949262916 +0000 UTC m=+1453.136758775" observedRunningTime="2025-10-02 18:42:13.446119032 +0000 UTC m=+1454.633614891" watchObservedRunningTime="2025-10-02 18:42:13.458255225 +0000 UTC m=+1454.645751084" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.462625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf","Type":"ContainerStarted","Data":"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff"} Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.481541 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.089062263 podStartE2EDuration="12.481525898s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="2025-10-02 18:42:03.55708645 +0000 UTC m=+1444.744582309" lastFinishedPulling="2025-10-02 18:42:11.949550045 +0000 UTC m=+1453.137045944" observedRunningTime="2025-10-02 18:42:13.462438627 +0000 UTC m=+1454.649934486" watchObservedRunningTime="2025-10-02 18:42:13.481525898 +0000 UTC m=+1454.669021757" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.486385 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.486576 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ec2d23-77a0-413f-a539-39098058c284-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.496634 4909 scope.go:117] "RemoveContainer" containerID="a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.506971 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.370703705 podStartE2EDuration="12.506950249s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="2025-10-02 18:42:03.813296601 +0000 UTC m=+1445.000792460" lastFinishedPulling="2025-10-02 18:42:11.949543125 +0000 UTC m=+1453.137039004" observedRunningTime="2025-10-02 18:42:13.499809814 +0000 UTC m=+1454.687305693" 
watchObservedRunningTime="2025-10-02 18:42:13.506950249 +0000 UTC m=+1454.694446108" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.520527 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-425k4" podStartSLOduration=3.2761111 podStartE2EDuration="10.520508997s" podCreationTimestamp="2025-10-02 18:42:03 +0000 UTC" firstStartedPulling="2025-10-02 18:42:04.832108318 +0000 UTC m=+1446.019604177" lastFinishedPulling="2025-10-02 18:42:12.076506215 +0000 UTC m=+1453.264002074" observedRunningTime="2025-10-02 18:42:13.516046526 +0000 UTC m=+1454.703542385" watchObservedRunningTime="2025-10-02 18:42:13.520508997 +0000 UTC m=+1454.708004856" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.534857 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.619093212 podStartE2EDuration="12.534840738s" podCreationTimestamp="2025-10-02 18:42:01 +0000 UTC" firstStartedPulling="2025-10-02 18:42:04.03353241 +0000 UTC m=+1445.221028269" lastFinishedPulling="2025-10-02 18:42:11.949279916 +0000 UTC m=+1453.136775795" observedRunningTime="2025-10-02 18:42:13.530715788 +0000 UTC m=+1454.718211647" watchObservedRunningTime="2025-10-02 18:42:13.534840738 +0000 UTC m=+1454.722336597" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.557205 4909 scope.go:117] "RemoveContainer" containerID="7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.557276 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"] Oct 02 18:42:13 crc kubenswrapper[4909]: E1002 18:42:13.559241 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f\": container with ID starting with 
7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f not found: ID does not exist" containerID="7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.559287 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f"} err="failed to get container status \"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f\": rpc error: code = NotFound desc = could not find container \"7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f\": container with ID starting with 7dfb18ddf6539a0350389f929625bff8fb76ef27c25b1c5ab81777656bcf385f not found: ID does not exist" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.559317 4909 scope.go:117] "RemoveContainer" containerID="a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974" Oct 02 18:42:13 crc kubenswrapper[4909]: E1002 18:42:13.561074 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974\": container with ID starting with a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974 not found: ID does not exist" containerID="a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.561102 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974"} err="failed to get container status \"a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974\": rpc error: code = NotFound desc = could not find container \"a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974\": container with ID starting with a2cb5929fc78f7940e0fe69afb7dcaaec64e87a5715c89e27b1c9c377a4d2974 not found: ID does not 
exist" Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.569395 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-d75nf"] Oct 02 18:42:13 crc kubenswrapper[4909]: I1002 18:42:13.626044 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ec2d23-77a0-413f-a539-39098058c284" path="/var/lib/kubelet/pods/a4ec2d23-77a0-413f-a539-39098058c284/volumes" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.192475 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.302091 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs\") pod \"51695048-17a4-40ae-bbfd-9dc1f27233fe\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.302164 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqtx\" (UniqueName: \"kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx\") pod \"51695048-17a4-40ae-bbfd-9dc1f27233fe\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.302209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data\") pod \"51695048-17a4-40ae-bbfd-9dc1f27233fe\" (UID: \"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.302336 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle\") pod \"51695048-17a4-40ae-bbfd-9dc1f27233fe\" (UID: 
\"51695048-17a4-40ae-bbfd-9dc1f27233fe\") " Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.303280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs" (OuterVolumeSpecName: "logs") pod "51695048-17a4-40ae-bbfd-9dc1f27233fe" (UID: "51695048-17a4-40ae-bbfd-9dc1f27233fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.326262 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx" (OuterVolumeSpecName: "kube-api-access-4wqtx") pod "51695048-17a4-40ae-bbfd-9dc1f27233fe" (UID: "51695048-17a4-40ae-bbfd-9dc1f27233fe"). InnerVolumeSpecName "kube-api-access-4wqtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.339966 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data" (OuterVolumeSpecName: "config-data") pod "51695048-17a4-40ae-bbfd-9dc1f27233fe" (UID: "51695048-17a4-40ae-bbfd-9dc1f27233fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.352229 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51695048-17a4-40ae-bbfd-9dc1f27233fe" (UID: "51695048-17a4-40ae-bbfd-9dc1f27233fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.405005 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51695048-17a4-40ae-bbfd-9dc1f27233fe-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.405153 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqtx\" (UniqueName: \"kubernetes.io/projected/51695048-17a4-40ae-bbfd-9dc1f27233fe-kube-api-access-4wqtx\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.405169 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.405183 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51695048-17a4-40ae-bbfd-9dc1f27233fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.475810 4909 generic.go:334] "Generic (PLEG): container finished" podID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerID="8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" exitCode=0 Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.476693 4909 generic.go:334] "Generic (PLEG): container finished" podID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerID="62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" exitCode=143 Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.475894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerDied","Data":"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53"} Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.475874 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.476874 4909 scope.go:117] "RemoveContainer" containerID="8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.476859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerDied","Data":"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c"} Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.477069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"51695048-17a4-40ae-bbfd-9dc1f27233fe","Type":"ContainerDied","Data":"affcaa742e51fa25cfb00bcc0a9ea10b01f74e612ab3ce794001a487af9d654b"} Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.520199 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.538312 4909 scope.go:117] "RemoveContainer" containerID="62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.544844 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.556983 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.557602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-log" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.557662 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-log" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.557738 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-metadata" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.557787 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-metadata" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.557852 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" containerName="heat-api" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.557898 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" containerName="heat-api" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.557947 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="init" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.557994 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="init" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.558057 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="dnsmasq-dns" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558113 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="dnsmasq-dns" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.558187 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" containerName="heat-cfnapi" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558233 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" containerName="heat-cfnapi" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558465 4909 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-log" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558528 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" containerName="heat-cfnapi" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558580 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ec2d23-77a0-413f-a539-39098058c284" containerName="dnsmasq-dns" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558636 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" containerName="nova-metadata-metadata" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.558696 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" containerName="heat-api" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.559962 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.565324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.596089 4909 scope.go:117] "RemoveContainer" containerID="8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.597220 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53\": container with ID starting with 8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53 not found: ID does not exist" containerID="8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.597264 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53"} err="failed to get container status \"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53\": rpc error: code = NotFound desc = could not find container \"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53\": container with ID starting with 8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53 not found: ID does not exist" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.597293 4909 scope.go:117] "RemoveContainer" containerID="62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" Oct 02 18:42:14 crc kubenswrapper[4909]: E1002 18:42:14.597990 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c\": container with ID starting with 62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c not found: ID does 
not exist" containerID="62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.598126 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c"} err="failed to get container status \"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c\": rpc error: code = NotFound desc = could not find container \"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c\": container with ID starting with 62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c not found: ID does not exist" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.598205 4909 scope.go:117] "RemoveContainer" containerID="8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.600126 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53"} err="failed to get container status \"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53\": rpc error: code = NotFound desc = could not find container \"8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53\": container with ID starting with 8e63c2cee72c8dc869161cc3f899ac5ecd662e27d2065823efc5403053fb8b53 not found: ID does not exist" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.600211 4909 scope.go:117] "RemoveContainer" containerID="62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.601198 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c"} err="failed to get container status \"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c\": rpc error: code = NotFound 
desc = could not find container \"62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c\": container with ID starting with 62a026dff371fcf32a6b41d9594e0b6048524e6951eb45e81c7c1229858a2d9c not found: ID does not exist" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.609537 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.609606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshvt\" (UniqueName: \"kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.609681 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.609727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.610041 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.614950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.616439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.714674 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.714732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshvt\" (UniqueName: \"kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.714791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.714832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 
crc kubenswrapper[4909]: I1002 18:42:14.714914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.727768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.733452 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.735463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.756862 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.758682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshvt\" (UniqueName: 
\"kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt\") pod \"nova-metadata-0\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.841281 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.900707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.911654 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:14 crc kubenswrapper[4909]: I1002 18:42:14.959469 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.097455 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.127004 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts\") pod \"57665172-f334-4a30-b3d0-60b0a50183e9\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.127433 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle\") pod \"57665172-f334-4a30-b3d0-60b0a50183e9\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.127505 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z54pw\" (UniqueName: 
\"kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw\") pod \"57665172-f334-4a30-b3d0-60b0a50183e9\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.128223 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data\") pod \"57665172-f334-4a30-b3d0-60b0a50183e9\" (UID: \"57665172-f334-4a30-b3d0-60b0a50183e9\") " Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.132245 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts" (OuterVolumeSpecName: "scripts") pod "57665172-f334-4a30-b3d0-60b0a50183e9" (UID: "57665172-f334-4a30-b3d0-60b0a50183e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.134906 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw" (OuterVolumeSpecName: "kube-api-access-z54pw") pod "57665172-f334-4a30-b3d0-60b0a50183e9" (UID: "57665172-f334-4a30-b3d0-60b0a50183e9"). InnerVolumeSpecName "kube-api-access-z54pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.167332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data" (OuterVolumeSpecName: "config-data") pod "57665172-f334-4a30-b3d0-60b0a50183e9" (UID: "57665172-f334-4a30-b3d0-60b0a50183e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.170992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57665172-f334-4a30-b3d0-60b0a50183e9" (UID: "57665172-f334-4a30-b3d0-60b0a50183e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.230137 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.230166 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z54pw\" (UniqueName: \"kubernetes.io/projected/57665172-f334-4a30-b3d0-60b0a50183e9-kube-api-access-z54pw\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.230180 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.230188 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57665172-f334-4a30-b3d0-60b0a50183e9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.397137 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:15 crc kubenswrapper[4909]: W1002 18:42:15.399224 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd297b3d1_d440_4492_9b15_44f57e246635.slice/crio-a790a68a95e825f7e90a4e51068247bd61924cf267a89efa95e99648c37d75e7 WatchSource:0}: Error finding container a790a68a95e825f7e90a4e51068247bd61924cf267a89efa95e99648c37d75e7: Status 404 returned error can't find the container with id a790a68a95e825f7e90a4e51068247bd61924cf267a89efa95e99648c37d75e7 Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.502988 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerStarted","Data":"a790a68a95e825f7e90a4e51068247bd61924cf267a89efa95e99648c37d75e7"} Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.507840 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlfn6" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.508195 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlfn6" event={"ID":"57665172-f334-4a30-b3d0-60b0a50183e9","Type":"ContainerDied","Data":"ee8aee27f989682a55fb0be59df904257b57403b6b91cb8f4c5e46a9d1eafdae"} Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.508249 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee8aee27f989682a55fb0be59df904257b57403b6b91cb8f4c5e46a9d1eafdae" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.625828 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51695048-17a4-40ae-bbfd-9dc1f27233fe" path="/var/lib/kubelet/pods/51695048-17a4-40ae-bbfd-9dc1f27233fe/volumes" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.685127 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58db674fb5-5s8s5" podUID="20c64b6e-d451-4962-9793-532f6c31f79d" containerName="heat-cfnapi" probeResult="failure" output="Get 
\"http://10.217.0.200:8000/healthcheck\": dial tcp 10.217.0.200:8000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.697398 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.697579 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" containerName="nova-scheduler-scheduler" containerID="cri-o://52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff" gracePeriod=30 Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.723973 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-9f896796b-dt79t" podUID="25396a7c-449c-4891-9f05-80acf9ef5309" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.201:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.735399 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.750071 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.750301 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-log" containerID="cri-o://214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" gracePeriod=30 Oct 02 18:42:15 crc kubenswrapper[4909]: I1002 18:42:15.750430 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-api" containerID="cri-o://a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" 
gracePeriod=30 Oct 02 18:42:16 crc kubenswrapper[4909]: E1002 18:42:16.066977 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c2caa97_1bbd_4c64_9d1a_f03164c9ae2d.slice/crio-conmon-a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c2caa97_1bbd_4c64_9d1a_f03164c9ae2d.slice/crio-a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a.scope\": RecentStats: unable to find data in memory cache]" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.369895 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.520448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerStarted","Data":"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896"} Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.520783 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerStarted","Data":"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4"} Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.520550 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-log" containerID="cri-o://dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" gracePeriod=30 Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.520992 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-metadata" containerID="cri-o://30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" gracePeriod=30 Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.524867 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerID="a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" exitCode=0 Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.524902 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerID="214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" exitCode=143 Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525079 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerDied","Data":"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a"} Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525262 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerDied","Data":"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee"} Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d","Type":"ContainerDied","Data":"9fc53444c46e0c729f22dd938df6b949cf19ec7d6fddc2702af1bb0ce0088684"} Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525300 4909 scope.go:117] "RemoveContainer" containerID="a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.525143 
4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w24fx" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="registry-server" containerID="cri-o://c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5" gracePeriod=2 Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.543379 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.543362519 podStartE2EDuration="2.543362519s" podCreationTimestamp="2025-10-02 18:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:16.538124843 +0000 UTC m=+1457.725620712" watchObservedRunningTime="2025-10-02 18:42:16.543362519 +0000 UTC m=+1457.730858378" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.556437 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkhvb\" (UniqueName: \"kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb\") pod \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.556618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle\") pod \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.556794 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs\") pod \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.556814 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data\") pod \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\" (UID: \"5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d\") " Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.562735 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs" (OuterVolumeSpecName: "logs") pod "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" (UID: "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.565250 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb" (OuterVolumeSpecName: "kube-api-access-lkhvb") pod "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" (UID: "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d"). InnerVolumeSpecName "kube-api-access-lkhvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.602213 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" (UID: "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.609379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data" (OuterVolumeSpecName: "config-data") pod "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" (UID: "5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.659003 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkhvb\" (UniqueName: \"kubernetes.io/projected/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-kube-api-access-lkhvb\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.659350 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.659364 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.659375 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.955356 4909 scope.go:117] "RemoveContainer" containerID="214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.961060 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:16 crc kubenswrapper[4909]: I1002 18:42:16.976328 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.064581 4909 scope.go:117] "RemoveContainer" containerID="a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.070867 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.071915 4909 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a\": container with ID starting with a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a not found: ID does not exist" containerID="a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.071972 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a"} err="failed to get container status \"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a\": rpc error: code = NotFound desc = could not find container \"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a\": container with ID starting with a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.072000 4909 scope.go:117] "RemoveContainer" containerID="214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.072644 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee\": container with ID starting with 214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee not found: ID does not exist" containerID="214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.072681 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee"} err="failed to get container status \"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee\": rpc error: code = NotFound 
desc = could not find container \"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee\": container with ID starting with 214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.072708 4909 scope.go:117] "RemoveContainer" containerID="a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.074305 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a"} err="failed to get container status \"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a\": rpc error: code = NotFound desc = could not find container \"a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a\": container with ID starting with a0653b6ecbbeff78b5bf5f5b181ac6992cff7b5e44b5511556f26df91017052a not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.074330 4909 scope.go:117] "RemoveContainer" containerID="214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.075814 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee"} err="failed to get container status \"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee\": rpc error: code = NotFound desc = could not find container \"214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee\": container with ID starting with 214027b345efc53f74fba2db90ac645c344c76c2ae4489a121630aebc778b3ee not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.090164 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.090635 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-log" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.090657 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-log" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.090705 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57665172-f334-4a30-b3d0-60b0a50183e9" containerName="nova-manage" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.090712 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57665172-f334-4a30-b3d0-60b0a50183e9" containerName="nova-manage" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.090722 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-api" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.090728 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-api" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.090990 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-log" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.091016 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57665172-f334-4a30-b3d0-60b0a50183e9" containerName="nova-manage" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.091050 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" containerName="nova-api-api" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.092512 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.094292 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.113393 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.144921 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.216395 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.216452 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.216479 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.216497 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wks8g\" (UniqueName: \"kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " 
pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.234973 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.268888 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.317800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzlwc\" (UniqueName: \"kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc\") pod \"7d99b60c-a7c5-40be-97a9-e424214bd40e\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.317897 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content\") pod \"7d99b60c-a7c5-40be-97a9-e424214bd40e\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.317926 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities\") pod \"7d99b60c-a7c5-40be-97a9-e424214bd40e\" (UID: \"7d99b60c-a7c5-40be-97a9-e424214bd40e\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.318231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.318265 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.318283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wks8g\" (UniqueName: \"kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.318422 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.318791 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.320276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities" (OuterVolumeSpecName: "utilities") pod "7d99b60c-a7c5-40be-97a9-e424214bd40e" (UID: "7d99b60c-a7c5-40be-97a9-e424214bd40e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.327198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc" (OuterVolumeSpecName: "kube-api-access-dzlwc") pod "7d99b60c-a7c5-40be-97a9-e424214bd40e" (UID: "7d99b60c-a7c5-40be-97a9-e424214bd40e"). 
InnerVolumeSpecName "kube-api-access-dzlwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.329879 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.337187 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.341813 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wks8g\" (UniqueName: \"kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g\") pod \"nova-api-0\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.359036 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.364168 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.382929 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d99b60c-a7c5-40be-97a9-e424214bd40e" (UID: "7d99b60c-a7c5-40be-97a9-e424214bd40e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419021 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556gr\" (UniqueName: \"kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr\") pod \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419400 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data\") pod \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshvt\" (UniqueName: \"kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt\") pod \"d297b3d1-d440-4492-9b15-44f57e246635\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs\") pod \"d297b3d1-d440-4492-9b15-44f57e246635\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419571 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle\") pod \"d297b3d1-d440-4492-9b15-44f57e246635\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data\") pod \"d297b3d1-d440-4492-9b15-44f57e246635\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419656 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs\") pod \"d297b3d1-d440-4492-9b15-44f57e246635\" (UID: \"d297b3d1-d440-4492-9b15-44f57e246635\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.419694 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle\") pod \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\" (UID: \"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf\") " Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.420109 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.420127 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzlwc\" (UniqueName: \"kubernetes.io/projected/7d99b60c-a7c5-40be-97a9-e424214bd40e-kube-api-access-dzlwc\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.420136 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d99b60c-a7c5-40be-97a9-e424214bd40e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.422861 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs" (OuterVolumeSpecName: "logs") pod "d297b3d1-d440-4492-9b15-44f57e246635" (UID: 
"d297b3d1-d440-4492-9b15-44f57e246635"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.423077 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr" (OuterVolumeSpecName: "kube-api-access-556gr") pod "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" (UID: "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf"). InnerVolumeSpecName "kube-api-access-556gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.427551 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt" (OuterVolumeSpecName: "kube-api-access-wshvt") pod "d297b3d1-d440-4492-9b15-44f57e246635" (UID: "d297b3d1-d440-4492-9b15-44f57e246635"). InnerVolumeSpecName "kube-api-access-wshvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.452333 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" (UID: "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.455285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data" (OuterVolumeSpecName: "config-data") pod "d297b3d1-d440-4492-9b15-44f57e246635" (UID: "d297b3d1-d440-4492-9b15-44f57e246635"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.475516 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data" (OuterVolumeSpecName: "config-data") pod "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" (UID: "6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.478849 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d297b3d1-d440-4492-9b15-44f57e246635" (UID: "d297b3d1-d440-4492-9b15-44f57e246635"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.504468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d297b3d1-d440-4492-9b15-44f57e246635" (UID: "d297b3d1-d440-4492-9b15-44f57e246635"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521826 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521854 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556gr\" (UniqueName: \"kubernetes.io/projected/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-kube-api-access-556gr\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521865 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521874 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshvt\" (UniqueName: \"kubernetes.io/projected/d297b3d1-d440-4492-9b15-44f57e246635-kube-api-access-wshvt\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521884 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521894 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.521903 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297b3d1-d440-4492-9b15-44f57e246635-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc 
kubenswrapper[4909]: I1002 18:42:17.521910 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297b3d1-d440-4492-9b15-44f57e246635-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.525979 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.550174 4909 generic.go:334] "Generic (PLEG): container finished" podID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" containerID="52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff" exitCode=0 Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.550244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf","Type":"ContainerDied","Data":"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.550275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf","Type":"ContainerDied","Data":"5e335229afc5bd687f7abc6d90326990acb1a0d3e8ded5f892991b85a8b005d8"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.550292 4909 scope.go:117] "RemoveContainer" containerID="52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.550397 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558389 4909 generic.go:334] "Generic (PLEG): container finished" podID="d297b3d1-d440-4492-9b15-44f57e246635" containerID="30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" exitCode=0 Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558432 4909 generic.go:334] "Generic (PLEG): container finished" podID="d297b3d1-d440-4492-9b15-44f57e246635" containerID="dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" exitCode=143 Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558550 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558759 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerDied","Data":"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558810 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerDied","Data":"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.558822 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d297b3d1-d440-4492-9b15-44f57e246635","Type":"ContainerDied","Data":"a790a68a95e825f7e90a4e51068247bd61924cf267a89efa95e99648c37d75e7"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.565318 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerID="c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5" exitCode=0 Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.565371 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerDied","Data":"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.565401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w24fx" event={"ID":"7d99b60c-a7c5-40be-97a9-e424214bd40e","Type":"ContainerDied","Data":"acc886f264b233fe4a79530bd4cb3f3c2927f63c6c58e10a3b6aaabb0c4d58c4"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.565422 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w24fx" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.579486 4909 generic.go:334] "Generic (PLEG): container finished" podID="63441e27-e4a8-4db3-9c60-5cd65f0e136a" containerID="7b3f29e1867c04993e0899074e0472781c29e2189fea3c4f84d4edf49c5b8948" exitCode=0 Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.579540 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-425k4" event={"ID":"63441e27-e4a8-4db3-9c60-5cd65f0e136a","Type":"ContainerDied","Data":"7b3f29e1867c04993e0899074e0472781c29e2189fea3c4f84d4edf49c5b8948"} Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.589548 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.594972 4909 scope.go:117] "RemoveContainer" containerID="52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.595778 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff\": container with ID starting with 52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff not 
found: ID does not exist" containerID="52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.596071 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff"} err="failed to get container status \"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff\": rpc error: code = NotFound desc = could not find container \"52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff\": container with ID starting with 52273c9e2373b4ecc3c6a64dfb966f186c5c5ccce370ae0caa08eae425fef6ff not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.596212 4909 scope.go:117] "RemoveContainer" containerID="30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.602476 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.651540 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d" path="/var/lib/kubelet/pods/5c2caa97-1bbd-4c64-9d1a-f03164c9ae2d/volumes" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.657472 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" path="/var/lib/kubelet/pods/6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf/volumes" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.658209 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.658585 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" containerName="nova-scheduler-scheduler" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.658599 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" containerName="nova-scheduler-scheduler" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.658617 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="extract-content" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.658624 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="extract-content" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.658635 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-metadata" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.658642 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-metadata" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.658662 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="extract-utilities" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.658669 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="extract-utilities" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.664324 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="registry-server" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.664367 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="registry-server" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.664446 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-log" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.664456 4909 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-log" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.666146 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-metadata" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.666381 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebf8564-7db8-47cc-ba5d-dd10aebd5dcf" containerName="nova-scheduler-scheduler" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.666410 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d297b3d1-d440-4492-9b15-44f57e246635" containerName="nova-metadata-log" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.666433 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" containerName="registry-server" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.667387 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.672728 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.698095 4909 scope.go:117] "RemoveContainer" containerID="dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.715172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.724444 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.729133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.729383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.729463 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hkc2\" (UniqueName: \"kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.732189 4909 scope.go:117] "RemoveContainer" 
containerID="30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.733286 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896\": container with ID starting with 30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896 not found: ID does not exist" containerID="30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.733342 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896"} err="failed to get container status \"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896\": rpc error: code = NotFound desc = could not find container \"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896\": container with ID starting with 30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.733376 4909 scope.go:117] "RemoveContainer" containerID="dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.733507 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.734202 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4\": container with ID starting with dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4 not found: ID does not exist" containerID="dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.734329 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4"} err="failed to get container status \"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4\": rpc error: code = NotFound desc = could not find container \"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4\": container with ID starting with dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.734364 4909 scope.go:117] "RemoveContainer" containerID="30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.741054 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896"} err="failed to get container status \"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896\": rpc error: code = NotFound desc = could not find container \"30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896\": container with ID starting with 30c27c2e829e001f8821c9ece787e644c4c9cc156c08b4e62ed12171eef3c896 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.741115 4909 scope.go:117] "RemoveContainer" containerID="dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.741461 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4"} err="failed to get container status \"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4\": rpc error: code = NotFound desc = could not find container \"dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4\": container with ID starting with 
dc5751f8640a6dfa159884eb67ecd0e99f1917d3fbedb1b96dd3d38cbf8f33b4 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.741984 4909 scope.go:117] "RemoveContainer" containerID="c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.744599 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.753568 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w24fx"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.761373 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.763353 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.765787 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.765914 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.771682 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.806492 4909 scope.go:117] "RemoveContainer" containerID="e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc 
kubenswrapper[4909]: I1002 18:42:17.831382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xll\" (UniqueName: \"kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831417 4909 scope.go:117] "RemoveContainer" containerID="9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831459 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.831854 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.832082 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hkc2\" (UniqueName: \"kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.832495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.838014 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.841720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.851199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hkc2\" (UniqueName: \"kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2\") pod \"nova-scheduler-0\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " pod="openstack/nova-scheduler-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.860993 4909 scope.go:117] "RemoveContainer" 
containerID="c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.861567 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5\": container with ID starting with c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5 not found: ID does not exist" containerID="c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.861605 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5"} err="failed to get container status \"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5\": rpc error: code = NotFound desc = could not find container \"c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5\": container with ID starting with c8c7076543bf3bad48dd60a239651354b1d023f51a1513b1934cb3dd1d5785f5 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.861630 4909 scope.go:117] "RemoveContainer" containerID="e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.861991 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5\": container with ID starting with e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5 not found: ID does not exist" containerID="e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.862060 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5"} err="failed to get container status \"e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5\": rpc error: code = NotFound desc = could not find container \"e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5\": container with ID starting with e0233af365ea970338d5be6dc61a739e9c7a91995b916b3f67a75ed1dce0eba5 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.862109 4909 scope.go:117] "RemoveContainer" containerID="9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451" Oct 02 18:42:17 crc kubenswrapper[4909]: E1002 18:42:17.862354 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451\": container with ID starting with 9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451 not found: ID does not exist" containerID="9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.862373 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451"} err="failed to get container status \"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451\": rpc error: code = NotFound desc = could not find container \"9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451\": container with ID starting with 9ee65ef4d04f48badd5b29e5ab0bfcbc21f8170629810c140f45b3e60a99b451 not found: ID does not exist" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.935376 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs\") pod \"nova-metadata-0\" (UID: 
\"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.935445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xll\" (UniqueName: \"kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.935469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.935500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.935544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.936168 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.939824 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.939962 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.950858 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:17 crc kubenswrapper[4909]: I1002 18:42:17.954660 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xll\" (UniqueName: \"kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll\") pod \"nova-metadata-0\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " pod="openstack/nova-metadata-0" Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.010407 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.059900 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.062814 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.087688 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.581210 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:42:18 crc kubenswrapper[4909]: W1002 18:42:18.587528 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a40adda_36e4_4145_a620_50792614e9e7.slice/crio-3096a0385f49c7e9fc663131f36618d56417bcc703062dd787478bd74d815b0e WatchSource:0}: Error finding container 3096a0385f49c7e9fc663131f36618d56417bcc703062dd787478bd74d815b0e: Status 404 returned error can't find the container with id 3096a0385f49c7e9fc663131f36618d56417bcc703062dd787478bd74d815b0e Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.619171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerStarted","Data":"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7"} Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.619223 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerStarted","Data":"285b1b2db35e24f4be64e8b09a0610286e028d86b8d1722b1f83d676726014fc"} Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.620940 4909 generic.go:334] "Generic (PLEG): container finished" podID="e3230517-79be-47ce-83ca-f423e167c5f1" containerID="776b21505c4ed56155dc4d3af85f15489de7333b08067fbc91be6f7cbf0da4f1" exitCode=0 Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.621188 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" event={"ID":"e3230517-79be-47ce-83ca-f423e167c5f1","Type":"ContainerDied","Data":"776b21505c4ed56155dc4d3af85f15489de7333b08067fbc91be6f7cbf0da4f1"} Oct 02 18:42:18 crc kubenswrapper[4909]: I1002 18:42:18.757777 4909 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:42:18 crc kubenswrapper[4909]: W1002 18:42:18.763230 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0aaf37_f800_46eb_a8d1_9cf16385ec5d.slice/crio-3d361289bfa64444327348f81d7a067be7ec8fdeea0e102e126e92bde2455dd2 WatchSource:0}: Error finding container 3d361289bfa64444327348f81d7a067be7ec8fdeea0e102e126e92bde2455dd2: Status 404 returned error can't find the container with id 3d361289bfa64444327348f81d7a067be7ec8fdeea0e102e126e92bde2455dd2 Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.121649 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-425k4" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.279804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle\") pod \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.279846 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjlw\" (UniqueName: \"kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw\") pod \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.279918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts\") pod \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.279968 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data\") pod \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\" (UID: \"63441e27-e4a8-4db3-9c60-5cd65f0e136a\") " Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.284917 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts" (OuterVolumeSpecName: "scripts") pod "63441e27-e4a8-4db3-9c60-5cd65f0e136a" (UID: "63441e27-e4a8-4db3-9c60-5cd65f0e136a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.285374 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw" (OuterVolumeSpecName: "kube-api-access-tcjlw") pod "63441e27-e4a8-4db3-9c60-5cd65f0e136a" (UID: "63441e27-e4a8-4db3-9c60-5cd65f0e136a"). InnerVolumeSpecName "kube-api-access-tcjlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.318287 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63441e27-e4a8-4db3-9c60-5cd65f0e136a" (UID: "63441e27-e4a8-4db3-9c60-5cd65f0e136a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.325393 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data" (OuterVolumeSpecName: "config-data") pod "63441e27-e4a8-4db3-9c60-5cd65f0e136a" (UID: "63441e27-e4a8-4db3-9c60-5cd65f0e136a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.381968 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.382213 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjlw\" (UniqueName: \"kubernetes.io/projected/63441e27-e4a8-4db3-9c60-5cd65f0e136a-kube-api-access-tcjlw\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.382277 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.382329 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63441e27-e4a8-4db3-9c60-5cd65f0e136a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.624019 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d99b60c-a7c5-40be-97a9-e424214bd40e" path="/var/lib/kubelet/pods/7d99b60c-a7c5-40be-97a9-e424214bd40e/volumes" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.625434 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d297b3d1-d440-4492-9b15-44f57e246635" path="/var/lib/kubelet/pods/d297b3d1-d440-4492-9b15-44f57e246635/volumes" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.632226 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-425k4" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.632230 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-425k4" event={"ID":"63441e27-e4a8-4db3-9c60-5cd65f0e136a","Type":"ContainerDied","Data":"e09e6fee1bc22a1a621fc96405492f45405df32845f34305e787f66473fab947"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.632417 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e09e6fee1bc22a1a621fc96405492f45405df32845f34305e787f66473fab947" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.634054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a40adda-36e4-4145-a620-50792614e9e7","Type":"ContainerStarted","Data":"d578cad187e4f2ad5ed1dfa936c8c490625b602fbd1ef7dc9e39a63f912064fb"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.634152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a40adda-36e4-4145-a620-50792614e9e7","Type":"ContainerStarted","Data":"3096a0385f49c7e9fc663131f36618d56417bcc703062dd787478bd74d815b0e"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.640004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerStarted","Data":"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.650320 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerStarted","Data":"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.650401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerStarted","Data":"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.650424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerStarted","Data":"3d361289bfa64444327348f81d7a067be7ec8fdeea0e102e126e92bde2455dd2"} Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.719395 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.719377415 podStartE2EDuration="3.719377415s" podCreationTimestamp="2025-10-02 18:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:19.715256605 +0000 UTC m=+1460.902752514" watchObservedRunningTime="2025-10-02 18:42:19.719377415 +0000 UTC m=+1460.906873294" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.738087 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.738067764 podStartE2EDuration="2.738067764s" podCreationTimestamp="2025-10-02 18:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:19.730372121 +0000 UTC m=+1460.917867980" watchObservedRunningTime="2025-10-02 18:42:19.738067764 +0000 UTC m=+1460.925563633" Oct 02 18:42:19 crc kubenswrapper[4909]: I1002 18:42:19.749457 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.749439852 podStartE2EDuration="2.749439852s" podCreationTimestamp="2025-10-02 18:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
18:42:19.74778295 +0000 UTC m=+1460.935278809" watchObservedRunningTime="2025-10-02 18:42:19.749439852 +0000 UTC m=+1460.936935711" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.094273 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.201213 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts\") pod \"e3230517-79be-47ce-83ca-f423e167c5f1\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.201261 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qxnl\" (UniqueName: \"kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl\") pod \"e3230517-79be-47ce-83ca-f423e167c5f1\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.201389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data\") pod \"e3230517-79be-47ce-83ca-f423e167c5f1\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.201508 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle\") pod \"e3230517-79be-47ce-83ca-f423e167c5f1\" (UID: \"e3230517-79be-47ce-83ca-f423e167c5f1\") " Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.206697 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl" (OuterVolumeSpecName: 
"kube-api-access-4qxnl") pod "e3230517-79be-47ce-83ca-f423e167c5f1" (UID: "e3230517-79be-47ce-83ca-f423e167c5f1"). InnerVolumeSpecName "kube-api-access-4qxnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.216183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts" (OuterVolumeSpecName: "scripts") pod "e3230517-79be-47ce-83ca-f423e167c5f1" (UID: "e3230517-79be-47ce-83ca-f423e167c5f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.243296 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3230517-79be-47ce-83ca-f423e167c5f1" (UID: "e3230517-79be-47ce-83ca-f423e167c5f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.243378 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data" (OuterVolumeSpecName: "config-data") pod "e3230517-79be-47ce-83ca-f423e167c5f1" (UID: "e3230517-79be-47ce-83ca-f423e167c5f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.295561 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.295791 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tb477" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="registry-server" containerID="cri-o://52d3a17b8128c4f37aa8403976e03a3ffe8ec8ac90ba72cf60845644fcfc43d2" gracePeriod=2 Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.303352 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.303381 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qxnl\" (UniqueName: \"kubernetes.io/projected/e3230517-79be-47ce-83ca-f423e167c5f1-kube-api-access-4qxnl\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.303393 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.303402 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3230517-79be-47ce-83ca-f423e167c5f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.692278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" event={"ID":"e3230517-79be-47ce-83ca-f423e167c5f1","Type":"ContainerDied","Data":"04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb"} Oct 02 18:42:20 crc 
kubenswrapper[4909]: I1002 18:42:20.692503 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e3cbdfa59cef665974464f1b89997be6a92c93bb7ca7054c8d40a4897da1bb" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.692390 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vv8d" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.703058 4909 generic.go:334] "Generic (PLEG): container finished" podID="cb76447a-6677-4bf2-9775-e273f1524050" containerID="52d3a17b8128c4f37aa8403976e03a3ffe8ec8ac90ba72cf60845644fcfc43d2" exitCode=0 Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.704156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerDied","Data":"52d3a17b8128c4f37aa8403976e03a3ffe8ec8ac90ba72cf60845644fcfc43d2"} Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.754617 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 18:42:20 crc kubenswrapper[4909]: E1002 18:42:20.755224 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63441e27-e4a8-4db3-9c60-5cd65f0e136a" containerName="aodh-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.755241 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="63441e27-e4a8-4db3-9c60-5cd65f0e136a" containerName="aodh-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: E1002 18:42:20.755261 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3230517-79be-47ce-83ca-f423e167c5f1" containerName="nova-cell1-conductor-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.755267 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3230517-79be-47ce-83ca-f423e167c5f1" containerName="nova-cell1-conductor-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 
18:42:20.755618 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3230517-79be-47ce-83ca-f423e167c5f1" containerName="nova-cell1-conductor-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.755649 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="63441e27-e4a8-4db3-9c60-5cd65f0e136a" containerName="aodh-db-sync" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.756642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.760940 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.778092 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.820333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.820435 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.820495 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lfvk\" (UniqueName: \"kubernetes.io/projected/4e10ad5d-5b1e-47a3-a188-50702d492942-kube-api-access-7lfvk\") pod \"nova-cell1-conductor-0\" (UID: 
\"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.885527 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.923188 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfvk\" (UniqueName: \"kubernetes.io/projected/4e10ad5d-5b1e-47a3-a188-50702d492942-kube-api-access-7lfvk\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.929870 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.930146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.936206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.936981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4e10ad5d-5b1e-47a3-a188-50702d492942-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:20 crc kubenswrapper[4909]: I1002 18:42:20.951575 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfvk\" (UniqueName: \"kubernetes.io/projected/4e10ad5d-5b1e-47a3-a188-50702d492942-kube-api-access-7lfvk\") pod \"nova-cell1-conductor-0\" (UID: \"4e10ad5d-5b1e-47a3-a188-50702d492942\") " pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.031176 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content\") pod \"cb76447a-6677-4bf2-9775-e273f1524050\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.031294 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j4mn\" (UniqueName: \"kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn\") pod \"cb76447a-6677-4bf2-9775-e273f1524050\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.031467 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities\") pod \"cb76447a-6677-4bf2-9775-e273f1524050\" (UID: \"cb76447a-6677-4bf2-9775-e273f1524050\") " Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.032422 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities" (OuterVolumeSpecName: "utilities") pod "cb76447a-6677-4bf2-9775-e273f1524050" (UID: "cb76447a-6677-4bf2-9775-e273f1524050"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.034475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn" (OuterVolumeSpecName: "kube-api-access-8j4mn") pod "cb76447a-6677-4bf2-9775-e273f1524050" (UID: "cb76447a-6677-4bf2-9775-e273f1524050"). InnerVolumeSpecName "kube-api-access-8j4mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.076799 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.116992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb76447a-6677-4bf2-9775-e273f1524050" (UID: "cb76447a-6677-4bf2-9775-e273f1524050"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.137243 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j4mn\" (UniqueName: \"kubernetes.io/projected/cb76447a-6677-4bf2-9775-e273f1524050-kube-api-access-8j4mn\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.137501 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.137579 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb76447a-6677-4bf2-9775-e273f1524050-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.713713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb477" event={"ID":"cb76447a-6677-4bf2-9775-e273f1524050","Type":"ContainerDied","Data":"8eb35fe3063724b9530c85f5af7a5fc1663e46bf522f95e05585b60a51ae3c8f"} Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.714010 4909 scope.go:117] "RemoveContainer" containerID="52d3a17b8128c4f37aa8403976e03a3ffe8ec8ac90ba72cf60845644fcfc43d2" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.713843 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tb477" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.753291 4909 scope.go:117] "RemoveContainer" containerID="a2229e5e472877011137da7d21b00fcba7df94273f66642e5cdab84fcb0e8f0f" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.753912 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.768881 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tb477"] Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.785336 4909 scope.go:117] "RemoveContainer" containerID="878906bdc0866c50a4d90e1a24224d4ec7aa73ae7c8e18ad8ea3a5ddaf0e1998" Oct 02 18:42:21 crc kubenswrapper[4909]: I1002 18:42:21.880080 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.130478 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.130719 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" containerName="kube-state-metrics" containerID="cri-o://5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae" gracePeriod=30 Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.274877 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.275759 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" containerName="mysqld-exporter" containerID="cri-o://7ec5e3716d9bee9579242e63236dfe08f677ba2dfc853a376e719350061ba460" gracePeriod=30 Oct 02 18:42:22 crc 
kubenswrapper[4909]: I1002 18:42:22.759653 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.759974 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4e10ad5d-5b1e-47a3-a188-50702d492942","Type":"ContainerStarted","Data":"25c078e33834ab7ec85a88195f2c2557ddce9bca859445a3f970d6191b4c4673"} Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.760011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4e10ad5d-5b1e-47a3-a188-50702d492942","Type":"ContainerStarted","Data":"5204db46656a932e44ff96f9e7b5520f61883504657d292394a5678c54dc34e6"} Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.760274 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.766918 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" containerID="7ec5e3716d9bee9579242e63236dfe08f677ba2dfc853a376e719350061ba460" exitCode=2 Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.766991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421","Type":"ContainerDied","Data":"7ec5e3716d9bee9579242e63236dfe08f677ba2dfc853a376e719350061ba460"} Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.771570 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" containerID="5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae" exitCode=2 Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.771607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"fe1b0647-0d74-4e52-9793-46dc1fa7d405","Type":"ContainerDied","Data":"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae"} Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.771630 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe1b0647-0d74-4e52-9793-46dc1fa7d405","Type":"ContainerDied","Data":"3a85e42a5a1defaa56ef731a7a5b649cbea528897e4b6983804768b28075d310"} Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.771646 4909 scope.go:117] "RemoveContainer" containerID="5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.771653 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.793089 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.831245 4909 scope.go:117] "RemoveContainer" containerID="5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae" Oct 02 18:42:22 crc kubenswrapper[4909]: E1002 18:42:22.831861 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae\": container with ID starting with 5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae not found: ID does not exist" containerID="5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.831919 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae"} err="failed to get container status \"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae\": rpc error: code = 
NotFound desc = could not find container \"5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae\": container with ID starting with 5c90220acfd068fa2ba2e17b3d97fdb7ea5587c2de9905cfac135ab663a7c3ae not found: ID does not exist" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.857064 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.857047164 podStartE2EDuration="2.857047164s" podCreationTimestamp="2025-10-02 18:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:22.810310782 +0000 UTC m=+1463.997806631" watchObservedRunningTime="2025-10-02 18:42:22.857047164 +0000 UTC m=+1464.044543023" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.867175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle\") pod \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.867327 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49df\" (UniqueName: \"kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df\") pod \"fe1b0647-0d74-4e52-9793-46dc1fa7d405\" (UID: \"fe1b0647-0d74-4e52-9793-46dc1fa7d405\") " Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.867381 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4kdk\" (UniqueName: \"kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk\") pod \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.867409 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data\") pod \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\" (UID: \"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421\") " Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.874254 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk" (OuterVolumeSpecName: "kube-api-access-d4kdk") pod "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" (UID: "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421"). InnerVolumeSpecName "kube-api-access-d4kdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.875990 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df" (OuterVolumeSpecName: "kube-api-access-k49df") pod "fe1b0647-0d74-4e52-9793-46dc1fa7d405" (UID: "fe1b0647-0d74-4e52-9793-46dc1fa7d405"). InnerVolumeSpecName "kube-api-access-k49df". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.909741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" (UID: "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.937563 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data" (OuterVolumeSpecName: "config-data") pod "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" (UID: "5d8ec777-e2b6-4e9e-a791-3ea7b8c70421"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.970232 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.970268 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49df\" (UniqueName: \"kubernetes.io/projected/fe1b0647-0d74-4e52-9793-46dc1fa7d405-kube-api-access-k49df\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.970279 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4kdk\" (UniqueName: \"kubernetes.io/projected/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-kube-api-access-d4kdk\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:22 crc kubenswrapper[4909]: I1002 18:42:22.970287 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.011346 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.087805 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.088000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.111297 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.124166 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: 
I1002 18:42:23.135430 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: E1002 18:42:23.135844 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="extract-utilities" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.135869 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="extract-utilities" Oct 02 18:42:23 crc kubenswrapper[4909]: E1002 18:42:23.135903 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="registry-server" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.135911 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="registry-server" Oct 02 18:42:23 crc kubenswrapper[4909]: E1002 18:42:23.135922 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" containerName="kube-state-metrics" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.135928 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" containerName="kube-state-metrics" Oct 02 18:42:23 crc kubenswrapper[4909]: E1002 18:42:23.135941 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" containerName="mysqld-exporter" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.135947 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" containerName="mysqld-exporter" Oct 02 18:42:23 crc kubenswrapper[4909]: E1002 18:42:23.135957 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="extract-content" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.135963 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="extract-content" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.136163 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb76447a-6677-4bf2-9775-e273f1524050" containerName="registry-server" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.136173 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" containerName="kube-state-metrics" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.136182 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" containerName="mysqld-exporter" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.136963 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.147636 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.147828 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.156238 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.276972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.277084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.277227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxsl\" (UniqueName: \"kubernetes.io/projected/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-api-access-2lxsl\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.277452 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.379436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.379516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxsl\" (UniqueName: \"kubernetes.io/projected/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-api-access-2lxsl\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.379575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.379628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.384865 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.386751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.388787 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.399934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxsl\" (UniqueName: 
\"kubernetes.io/projected/ccc2628c-e957-4344-b396-d3fe3ddd0da1-kube-api-access-2lxsl\") pod \"kube-state-metrics-0\" (UID: \"ccc2628c-e957-4344-b396-d3fe3ddd0da1\") " pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.455846 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.605582 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.637339 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.643269 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t2tfr" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.645002 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.647981 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.694526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.694644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79v5\" (UniqueName: \"kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.695041 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.695115 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.711419 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb76447a-6677-4bf2-9775-e273f1524050" path="/var/lib/kubelet/pods/cb76447a-6677-4bf2-9775-e273f1524050/volumes" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.712016 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1b0647-0d74-4e52-9793-46dc1fa7d405" path="/var/lib/kubelet/pods/fe1b0647-0d74-4e52-9793-46dc1fa7d405/volumes" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.712868 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.785281 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.785923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5d8ec777-e2b6-4e9e-a791-3ea7b8c70421","Type":"ContainerDied","Data":"6d3157829b8e95d79a5b1be6769bfaae889b3f9eba6837d11b05f4e9b6e44a80"} Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.785946 4909 scope.go:117] "RemoveContainer" containerID="7ec5e3716d9bee9579242e63236dfe08f677ba2dfc853a376e719350061ba460" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.797383 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.797433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79v5\" (UniqueName: \"kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.797545 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.797569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.816714 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.819003 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.819744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.833606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79v5\" (UniqueName: \"kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5\") pod \"aodh-0\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.851085 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.876469 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.887359 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.888678 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.891789 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.892004 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.904886 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.963340 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 18:42:23 crc kubenswrapper[4909]: I1002 18:42:23.984631 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.001671 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkf4c\" (UniqueName: \"kubernetes.io/projected/ed14f597-80d1-41ec-a205-85e2e85173c2-kube-api-access-wkf4c\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.001750 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.001849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: 
\"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.001877 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.103172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.103580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.103620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.103686 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkf4c\" (UniqueName: \"kubernetes.io/projected/ed14f597-80d1-41ec-a205-85e2e85173c2-kube-api-access-wkf4c\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 
18:42:24.115587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.118349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.118941 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed14f597-80d1-41ec-a205-85e2e85173c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.130585 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkf4c\" (UniqueName: \"kubernetes.io/projected/ed14f597-80d1-41ec-a205-85e2e85173c2-kube-api-access-wkf4c\") pod \"mysqld-exporter-0\" (UID: \"ed14f597-80d1-41ec-a205-85e2e85173c2\") " pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.230503 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.468336 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:42:24 crc kubenswrapper[4909]: W1002 18:42:24.481626 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e3d6540_58f9_4213_b9d3_c1f645eb5107.slice/crio-d23ecbdeecb734f2d58c1428a2259bad9ac12c4d85ddc5220236317c9aa12b4f WatchSource:0}: Error finding container d23ecbdeecb734f2d58c1428a2259bad9ac12c4d85ddc5220236317c9aa12b4f: Status 404 returned error can't find the container with id d23ecbdeecb734f2d58c1428a2259bad9ac12c4d85ddc5220236317c9aa12b4f Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.786113 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 02 18:42:24 crc kubenswrapper[4909]: W1002 18:42:24.788151 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded14f597_80d1_41ec_a205_85e2e85173c2.slice/crio-0250c38ac9aea513ef7ebf2ee75f2b04cad5d19fdad1fe177804ed7d8f2a52e5 WatchSource:0}: Error finding container 0250c38ac9aea513ef7ebf2ee75f2b04cad5d19fdad1fe177804ed7d8f2a52e5: Status 404 returned error can't find the container with id 0250c38ac9aea513ef7ebf2ee75f2b04cad5d19fdad1fe177804ed7d8f2a52e5 Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.795558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ccc2628c-e957-4344-b396-d3fe3ddd0da1","Type":"ContainerStarted","Data":"238bc77be5fd38b404c55ab84bf6f06513eb55fbac98d755c1b5599a86718612"} Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.795616 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"ccc2628c-e957-4344-b396-d3fe3ddd0da1","Type":"ContainerStarted","Data":"46803b8b829ba5bc3360667881c162fc0056a448f82dd1102f7b617ff4defe7f"} Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.795666 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.796783 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerStarted","Data":"d23ecbdeecb734f2d58c1428a2259bad9ac12c4d85ddc5220236317c9aa12b4f"} Oct 02 18:42:24 crc kubenswrapper[4909]: I1002 18:42:24.814986 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.35004031 podStartE2EDuration="1.814966797s" podCreationTimestamp="2025-10-02 18:42:23 +0000 UTC" firstStartedPulling="2025-10-02 18:42:23.988341094 +0000 UTC m=+1465.175836953" lastFinishedPulling="2025-10-02 18:42:24.453267581 +0000 UTC m=+1465.640763440" observedRunningTime="2025-10-02 18:42:24.809536095 +0000 UTC m=+1465.997031954" watchObservedRunningTime="2025-10-02 18:42:24.814966797 +0000 UTC m=+1466.002462646" Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.018809 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.019122 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="proxy-httpd" containerID="cri-o://18fd289cab681d1b85aee0fa477341f650d2ec2921c3848b71442c545bec6afb" gracePeriod=30 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.019147 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="sg-core" 
containerID="cri-o://ba435c93015d0d32f1ba47dc65036de355fb859da9b8d9f5b78d8c4e6f712460" gracePeriod=30 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.019089 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-central-agent" containerID="cri-o://c6a801bc60f11ab0c0662b7f78d788419233f8d0754312bb33cdee33c2f0185e" gracePeriod=30 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.019242 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-notification-agent" containerID="cri-o://3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d" gracePeriod=30 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.621339 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8ec777-e2b6-4e9e-a791-3ea7b8c70421" path="/var/lib/kubelet/pods/5d8ec777-e2b6-4e9e-a791-3ea7b8c70421/volumes" Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.825693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ed14f597-80d1-41ec-a205-85e2e85173c2","Type":"ContainerStarted","Data":"0250c38ac9aea513ef7ebf2ee75f2b04cad5d19fdad1fe177804ed7d8f2a52e5"} Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.828219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerStarted","Data":"027d03a71e93b7a88e39c9be9b9ebbfdf73e0fcd15095ff412c2f4a13f5e59fe"} Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.831510 4909 generic.go:334] "Generic (PLEG): container finished" podID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerID="18fd289cab681d1b85aee0fa477341f650d2ec2921c3848b71442c545bec6afb" exitCode=0 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.831534 4909 
generic.go:334] "Generic (PLEG): container finished" podID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerID="ba435c93015d0d32f1ba47dc65036de355fb859da9b8d9f5b78d8c4e6f712460" exitCode=2 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.831543 4909 generic.go:334] "Generic (PLEG): container finished" podID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerID="c6a801bc60f11ab0c0662b7f78d788419233f8d0754312bb33cdee33c2f0185e" exitCode=0 Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.832633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerDied","Data":"18fd289cab681d1b85aee0fa477341f650d2ec2921c3848b71442c545bec6afb"} Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.832694 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerDied","Data":"ba435c93015d0d32f1ba47dc65036de355fb859da9b8d9f5b78d8c4e6f712460"} Oct 02 18:42:25 crc kubenswrapper[4909]: I1002 18:42:25.832709 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerDied","Data":"c6a801bc60f11ab0c0662b7f78d788419233f8d0754312bb33cdee33c2f0185e"} Oct 02 18:42:26 crc kubenswrapper[4909]: E1002 18:42:26.442600 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f66d6a_589f_4c63_9f82_84bbeee56676.slice/crio-conmon-3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f66d6a_589f_4c63_9f82_84bbeee56676.slice/crio-3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d.scope\": RecentStats: unable to find data in memory cache]" Oct 
02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.854256 4909 generic.go:334] "Generic (PLEG): container finished" podID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerID="3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d" exitCode=0 Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.854331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerDied","Data":"3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d"} Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.854363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69f66d6a-589f-4c63-9f82-84bbeee56676","Type":"ContainerDied","Data":"ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d"} Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.854375 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9ae6a8e7ee201d2056cb9987b8ee4e08317073aaae194a40c59214997db19d" Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.856467 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ed14f597-80d1-41ec-a205-85e2e85173c2","Type":"ContainerStarted","Data":"fa0aa519c89f3fa4c14d29109d1cb24bf57a68f16ec9468475a74351d2bc5400"} Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.886618 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.870677166 podStartE2EDuration="3.886593331s" podCreationTimestamp="2025-10-02 18:42:23 +0000 UTC" firstStartedPulling="2025-10-02 18:42:24.789891177 +0000 UTC m=+1465.977387036" lastFinishedPulling="2025-10-02 18:42:25.805807342 +0000 UTC m=+1466.993303201" observedRunningTime="2025-10-02 18:42:26.878119184 +0000 UTC m=+1468.065615053" watchObservedRunningTime="2025-10-02 18:42:26.886593331 +0000 UTC m=+1468.074089190" Oct 02 18:42:26 
crc kubenswrapper[4909]: I1002 18:42:26.891498 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.972964 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973285 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973336 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp987\" (UniqueName: \"kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973391 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973410 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.973533 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml\") pod \"69f66d6a-589f-4c63-9f82-84bbeee56676\" (UID: \"69f66d6a-589f-4c63-9f82-84bbeee56676\") " Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.975265 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:26 crc kubenswrapper[4909]: I1002 18:42:26.983344 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.062243 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987" (OuterVolumeSpecName: "kube-api-access-bp987") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "kube-api-access-bp987". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.062332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts" (OuterVolumeSpecName: "scripts") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.078367 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.078400 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp987\" (UniqueName: \"kubernetes.io/projected/69f66d6a-589f-4c63-9f82-84bbeee56676-kube-api-access-bp987\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.078411 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.078418 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69f66d6a-589f-4c63-9f82-84bbeee56676-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.098369 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.182151 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.188610 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.230266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data" (OuterVolumeSpecName: "config-data") pod "69f66d6a-589f-4c63-9f82-84bbeee56676" (UID: "69f66d6a-589f-4c63-9f82-84bbeee56676"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.284457 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.284501 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f66d6a-589f-4c63-9f82-84bbeee56676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.292134 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.526713 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.527145 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.866825 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.898012 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.917659 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.930796 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:27 crc kubenswrapper[4909]: E1002 18:42:27.931263 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="sg-core" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931278 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="sg-core" Oct 02 18:42:27 crc kubenswrapper[4909]: E1002 18:42:27.931314 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="proxy-httpd" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931320 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="proxy-httpd" Oct 02 18:42:27 crc kubenswrapper[4909]: E1002 18:42:27.931328 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-central-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931335 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-central-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: E1002 18:42:27.931343 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-notification-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931351 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-notification-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931533 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-notification-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931553 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="sg-core" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931566 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="proxy-httpd" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.931580 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" containerName="ceilometer-central-agent" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.935267 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.939318 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.939984 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.940127 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.940850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998223 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998255 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998364 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998389 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sn4x\" (UniqueName: \"kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:27 crc kubenswrapper[4909]: I1002 18:42:27.998428 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.011084 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.059401 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.088381 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.088465 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.099943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100163 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sn4x\" (UniqueName: \"kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100277 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.100407 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.101018 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.101399 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.106934 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.108792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.109395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.109516 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.122403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.134072 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sn4x\" (UniqueName: \"kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x\") pod \"ceilometer-0\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") 
" pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.284286 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.611272 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.611483 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.825640 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:28 crc kubenswrapper[4909]: W1002 18:42:28.837283 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd560b6d_dc90_4758_b724_96389acde0ed.slice/crio-e7746d6db3a4881d6251d2167be774329b15dd56ddcb653aca39a04b0266a4fe WatchSource:0}: Error finding container e7746d6db3a4881d6251d2167be774329b15dd56ddcb653aca39a04b0266a4fe: Status 404 returned error can't find the container with id e7746d6db3a4881d6251d2167be774329b15dd56ddcb653aca39a04b0266a4fe Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.879324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerStarted","Data":"e7746d6db3a4881d6251d2167be774329b15dd56ddcb653aca39a04b0266a4fe"} Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.882401 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerStarted","Data":"b1bdb5b0300b816d1a88c3e450e7b0ebe9ab507647a67389d98ca3444cf20cba"} Oct 02 18:42:28 crc kubenswrapper[4909]: I1002 18:42:28.913652 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 18:42:29 crc kubenswrapper[4909]: I1002 18:42:29.101352 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:29 crc kubenswrapper[4909]: I1002 18:42:29.101672 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:29 crc kubenswrapper[4909]: I1002 18:42:29.630274 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f66d6a-589f-4c63-9f82-84bbeee56676" path="/var/lib/kubelet/pods/69f66d6a-589f-4c63-9f82-84bbeee56676/volumes" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.536040 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"] Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.538413 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.558503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"] Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.590227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.590405 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.590427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m82zt\" (UniqueName: \"kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.691586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.691633 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-m82zt\" (UniqueName: \"kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.691727 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.693625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.693932 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.716838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m82zt\" (UniqueName: \"kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt\") pod \"redhat-marketplace-5c7ms\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") " pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.861004 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c7ms" Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.916803 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerStarted","Data":"7141d4d0d7dd31d6f65a754876a75efc99d82d1576f3e612305f2dad94e3b0e3"} Oct 02 18:42:30 crc kubenswrapper[4909]: I1002 18:42:30.919340 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerStarted","Data":"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed"} Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.122697 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.429856 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"] Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.459382 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.935308 4909 generic.go:334] "Generic (PLEG): container finished" podID="733b31a0-f130-49d0-80c6-51a578608fa7" containerID="19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0" exitCode=0 Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.935419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerDied","Data":"19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0"} Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.935591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" 
event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerStarted","Data":"06c8ff6e334b09fac0014a8d1f780b0cf4f2d726ec08f8f1799be94c96fb0671"} Oct 02 18:42:31 crc kubenswrapper[4909]: I1002 18:42:31.945184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerStarted","Data":"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8"} Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.960538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerStarted","Data":"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40"} Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.963405 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerStarted","Data":"a823801d4bacac69a0d22aaea122bbc7ebe6170cd9197e9ffecb4dc97c9fb3b3"} Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.963713 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-api" containerID="cri-o://027d03a71e93b7a88e39c9be9b9ebbfdf73e0fcd15095ff412c2f4a13f5e59fe" gracePeriod=30 Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.964154 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-notifier" containerID="cri-o://7141d4d0d7dd31d6f65a754876a75efc99d82d1576f3e612305f2dad94e3b0e3" gracePeriod=30 Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.964194 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-listener" 
containerID="cri-o://a823801d4bacac69a0d22aaea122bbc7ebe6170cd9197e9ffecb4dc97c9fb3b3" gracePeriod=30 Oct 02 18:42:32 crc kubenswrapper[4909]: I1002 18:42:32.964161 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-evaluator" containerID="cri-o://b1bdb5b0300b816d1a88c3e450e7b0ebe9ab507647a67389d98ca3444cf20cba" gracePeriod=30 Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.014955 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.36955368 podStartE2EDuration="10.014939619s" podCreationTimestamp="2025-10-02 18:42:23 +0000 UTC" firstStartedPulling="2025-10-02 18:42:24.485965362 +0000 UTC m=+1465.673461221" lastFinishedPulling="2025-10-02 18:42:32.131351301 +0000 UTC m=+1473.318847160" observedRunningTime="2025-10-02 18:42:33.008638289 +0000 UTC m=+1474.196134148" watchObservedRunningTime="2025-10-02 18:42:33.014939619 +0000 UTC m=+1474.202435478" Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.474217 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994261 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerID="7141d4d0d7dd31d6f65a754876a75efc99d82d1576f3e612305f2dad94e3b0e3" exitCode=0 Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994303 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerID="b1bdb5b0300b816d1a88c3e450e7b0ebe9ab507647a67389d98ca3444cf20cba" exitCode=0 Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994313 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerID="027d03a71e93b7a88e39c9be9b9ebbfdf73e0fcd15095ff412c2f4a13f5e59fe" exitCode=0 Oct 02 
18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994367 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerDied","Data":"7141d4d0d7dd31d6f65a754876a75efc99d82d1576f3e612305f2dad94e3b0e3"} Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerDied","Data":"b1bdb5b0300b816d1a88c3e450e7b0ebe9ab507647a67389d98ca3444cf20cba"} Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.994414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerDied","Data":"027d03a71e93b7a88e39c9be9b9ebbfdf73e0fcd15095ff412c2f4a13f5e59fe"} Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.997138 4909 generic.go:334] "Generic (PLEG): container finished" podID="733b31a0-f130-49d0-80c6-51a578608fa7" containerID="b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db" exitCode=0 Oct 02 18:42:33 crc kubenswrapper[4909]: I1002 18:42:33.997173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerDied","Data":"b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db"} Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.011481 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerStarted","Data":"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"} Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.013993 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerStarted","Data":"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425"} Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.014176 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-central-agent" containerID="cri-o://b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed" gracePeriod=30 Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.014215 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.014218 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="sg-core" containerID="cri-o://292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40" gracePeriod=30 Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.014265 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="proxy-httpd" containerID="cri-o://bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425" gracePeriod=30 Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.014225 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-notification-agent" containerID="cri-o://31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8" gracePeriod=30 Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.041169 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5c7ms" podStartSLOduration=2.49820481 podStartE2EDuration="5.041152582s" podCreationTimestamp="2025-10-02 
18:42:30 +0000 UTC" firstStartedPulling="2025-10-02 18:42:32.068480361 +0000 UTC m=+1473.255976220" lastFinishedPulling="2025-10-02 18:42:34.611428133 +0000 UTC m=+1475.798923992" observedRunningTime="2025-10-02 18:42:35.037065403 +0000 UTC m=+1476.224561262" watchObservedRunningTime="2025-10-02 18:42:35.041152582 +0000 UTC m=+1476.228648441" Oct 02 18:42:35 crc kubenswrapper[4909]: I1002 18:42:35.062798 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.969761643 podStartE2EDuration="8.062783243s" podCreationTimestamp="2025-10-02 18:42:27 +0000 UTC" firstStartedPulling="2025-10-02 18:42:28.839830306 +0000 UTC m=+1470.027326165" lastFinishedPulling="2025-10-02 18:42:33.932851906 +0000 UTC m=+1475.120347765" observedRunningTime="2025-10-02 18:42:35.059423457 +0000 UTC m=+1476.246919326" watchObservedRunningTime="2025-10-02 18:42:35.062783243 +0000 UTC m=+1476.250279102" Oct 02 18:42:36 crc kubenswrapper[4909]: I1002 18:42:36.027389 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd560b6d-dc90-4758-b724-96389acde0ed" containerID="bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425" exitCode=0 Oct 02 18:42:36 crc kubenswrapper[4909]: I1002 18:42:36.027421 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd560b6d-dc90-4758-b724-96389acde0ed" containerID="292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40" exitCode=2 Oct 02 18:42:36 crc kubenswrapper[4909]: I1002 18:42:36.027460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerDied","Data":"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425"} Oct 02 18:42:36 crc kubenswrapper[4909]: I1002 18:42:36.027490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerDied","Data":"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40"} Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.039356 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd560b6d-dc90-4758-b724-96389acde0ed" containerID="31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8" exitCode=0 Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.039436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerDied","Data":"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8"} Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.531580 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.531988 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.532416 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.544249 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 18:42:37 crc kubenswrapper[4909]: I1002 18:42:37.930664 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.053544 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd560b6d-dc90-4758-b724-96389acde0ed" containerID="b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed" exitCode=0 Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.053609 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.053646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerDied","Data":"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed"} Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.053687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd560b6d-dc90-4758-b724-96389acde0ed","Type":"ContainerDied","Data":"e7746d6db3a4881d6251d2167be774329b15dd56ddcb653aca39a04b0266a4fe"} Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.053708 4909 scope.go:117] "RemoveContainer" containerID="bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.054406 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.059897 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.060598 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sn4x\" (UniqueName: \"kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.060740 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.060828 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.060910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.061002 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.061051 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.061157 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.061186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle\") pod \"fd560b6d-dc90-4758-b724-96389acde0ed\" (UID: \"fd560b6d-dc90-4758-b724-96389acde0ed\") " Oct 02 18:42:38 crc 
kubenswrapper[4909]: I1002 18:42:38.061686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.061872 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.062732 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.068199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts" (OuterVolumeSpecName: "scripts") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.142903 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x" (OuterVolumeSpecName: "kube-api-access-7sn4x") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "kube-api-access-7sn4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.169000 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd560b6d-dc90-4758-b724-96389acde0ed-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.169032 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.169055 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sn4x\" (UniqueName: \"kubernetes.io/projected/fd560b6d-dc90-4758-b724-96389acde0ed-kube-api-access-7sn4x\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.175166 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.190534 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.223297 4909 scope.go:117] "RemoveContainer" containerID="292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.241987 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.270510 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.271891 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.316130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.316630 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"] Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.317088 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="proxy-httpd" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317101 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="proxy-httpd" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.317112 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="sg-core" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317119 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="sg-core" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.317129 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-notification-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317135 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-notification-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.317170 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-central-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317176 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-central-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317359 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-notification-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317375 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="proxy-httpd" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317387 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="sg-core" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.317401 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" containerName="ceilometer-central-agent" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.318516 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.333138 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.338850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"] Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.345528 4909 scope.go:117] "RemoveContainer" containerID="31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.377660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcln\" (UniqueName: \"kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.377796 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config\") pod 
\"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378628 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.378642 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.383990 4909 scope.go:117] "RemoveContainer" 
containerID="b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.402146 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data" (OuterVolumeSpecName: "config-data") pod "fd560b6d-dc90-4758-b724-96389acde0ed" (UID: "fd560b6d-dc90-4758-b724-96389acde0ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.410329 4909 scope.go:117] "RemoveContainer" containerID="bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.414413 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425\": container with ID starting with bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425 not found: ID does not exist" containerID="bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.414467 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425"} err="failed to get container status \"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425\": rpc error: code = NotFound desc = could not find container \"bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425\": container with ID starting with bd809e91348e11f844a52c71e4728bc4e4013037d81b01c87b02471319432425 not found: ID does not exist" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.414495 4909 scope.go:117] "RemoveContainer" containerID="292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.414994 4909 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40\": container with ID starting with 292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40 not found: ID does not exist" containerID="292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.415013 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40"} err="failed to get container status \"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40\": rpc error: code = NotFound desc = could not find container \"292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40\": container with ID starting with 292c87041ae5e93cff508833868b6d9cc35033ab4979622907d288646ac2ec40 not found: ID does not exist" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.415056 4909 scope.go:117] "RemoveContainer" containerID="31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.418427 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8\": container with ID starting with 31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8 not found: ID does not exist" containerID="31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.418635 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8"} err="failed to get container status \"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8\": rpc error: code = NotFound 
desc = could not find container \"31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8\": container with ID starting with 31cd0185eb3619fdd5dc042149cf02dcabf1f6857ef00da7970340308f5ee2b8 not found: ID does not exist" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.418771 4909 scope.go:117] "RemoveContainer" containerID="b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed" Oct 02 18:42:38 crc kubenswrapper[4909]: E1002 18:42:38.421298 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed\": container with ID starting with b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed not found: ID does not exist" containerID="b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.421341 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed"} err="failed to get container status \"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed\": rpc error: code = NotFound desc = could not find container \"b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed\": container with ID starting with b35394871dbe53589e7c6ea9c0edaaaf26750d12c164f19b9f3b0d0dfec975ed not found: ID does not exist" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcln\" (UniqueName: \"kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.483750 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fd560b6d-dc90-4758-b724-96389acde0ed-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.484819 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.484841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.485529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.485608 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.486218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 
18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.506869 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcln\" (UniqueName: \"kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln\") pod \"dnsmasq-dns-79b5d74c8c-6lc6j\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.661893 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.686863 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.701580 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.724449 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.727799 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.737435 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.737734 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.738161 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.744898 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790837 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790862 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790930 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.790982 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.791030 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tww6\" (UniqueName: \"kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.893566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894191 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tww6\" (UniqueName: \"kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894476 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.894515 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.895105 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.896071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.900860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.902788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.903006 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.909668 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.916436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tww6\" (UniqueName: \"kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:38 crc kubenswrapper[4909]: I1002 18:42:38.925130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " pod="openstack/ceilometer-0"
Oct 02 18:42:39 crc kubenswrapper[4909]: I1002 18:42:39.084578 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 18:42:39 crc kubenswrapper[4909]: I1002 18:42:39.141420 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 18:42:39 crc kubenswrapper[4909]: I1002 18:42:39.237998 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"]
Oct 02 18:42:39 crc kubenswrapper[4909]: I1002 18:42:39.620462 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd560b6d-dc90-4758-b724-96389acde0ed" path="/var/lib/kubelet/pods/fd560b6d-dc90-4758-b724-96389acde0ed/volumes"
Oct 02 18:42:39 crc kubenswrapper[4909]: I1002 18:42:39.689200 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.080398 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerID="91abda2f96b6133b3adc760820d74c635534ddb1a1b872bef5f3aafbe9be557b" exitCode=0
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.080470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" event={"ID":"cf07891f-01ab-41de-8d62-a85ff3e20102","Type":"ContainerDied","Data":"91abda2f96b6133b3adc760820d74c635534ddb1a1b872bef5f3aafbe9be557b"}
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.080497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" event={"ID":"cf07891f-01ab-41de-8d62-a85ff3e20102","Type":"ContainerStarted","Data":"2c40dba25d7518b660adbc1776db946ed3f0e20eb4d044dbf78b86fdec7c4c8e"}
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.082665 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerStarted","Data":"5e75642d215dd90725764acc3c06c8a6c5d72d8915011d01acffd7936f7471e9"}
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.484481 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.751917 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.861590 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.861641 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:40 crc kubenswrapper[4909]: I1002 18:42:40.926781 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.095911 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" event={"ID":"cf07891f-01ab-41de-8d62-a85ff3e20102","Type":"ContainerStarted","Data":"ebabeb7a4d04573ba4147fecfd9dc7c2d1652e467e4709ffa1f03b1e63a14a20"}
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.098823 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-log" containerID="cri-o://74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7" gracePeriod=30
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.098937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerStarted","Data":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"}
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.100830 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-api" containerID="cri-o://847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77" gracePeriod=30
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.131437 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" podStartSLOduration=3.131412229 podStartE2EDuration="3.131412229s" podCreationTimestamp="2025-10-02 18:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:41.118991888 +0000 UTC m=+1482.306487787" watchObservedRunningTime="2025-10-02 18:42:41.131412229 +0000 UTC m=+1482.318908108"
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.192006 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:41 crc kubenswrapper[4909]: I1002 18:42:41.278296 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"]
Oct 02 18:42:42 crc kubenswrapper[4909]: I1002 18:42:42.111953 4909 generic.go:334] "Generic (PLEG): container finished" podID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerID="74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7" exitCode=143
Oct 02 18:42:42 crc kubenswrapper[4909]: I1002 18:42:42.112073 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerDied","Data":"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7"}
Oct 02 18:42:42 crc kubenswrapper[4909]: I1002 18:42:42.113067 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j"
Oct 02 18:42:43 crc kubenswrapper[4909]: I1002 18:42:43.124904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerStarted","Data":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"}
Oct 02 18:42:43 crc kubenswrapper[4909]: I1002 18:42:43.125050 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5c7ms" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="registry-server" containerID="cri-o://f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6" gracePeriod=2
Oct 02 18:42:43 crc kubenswrapper[4909]: I1002 18:42:43.830299 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.018419 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.027800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m82zt\" (UniqueName: \"kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt\") pod \"733b31a0-f130-49d0-80c6-51a578608fa7\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.028009 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities\") pod \"733b31a0-f130-49d0-80c6-51a578608fa7\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.028070 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content\") pod \"733b31a0-f130-49d0-80c6-51a578608fa7\" (UID: \"733b31a0-f130-49d0-80c6-51a578608fa7\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.029731 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities" (OuterVolumeSpecName: "utilities") pod "733b31a0-f130-49d0-80c6-51a578608fa7" (UID: "733b31a0-f130-49d0-80c6-51a578608fa7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.044960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt" (OuterVolumeSpecName: "kube-api-access-m82zt") pod "733b31a0-f130-49d0-80c6-51a578608fa7" (UID: "733b31a0-f130-49d0-80c6-51a578608fa7"). InnerVolumeSpecName "kube-api-access-m82zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.050605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "733b31a0-f130-49d0-80c6-51a578608fa7" (UID: "733b31a0-f130-49d0-80c6-51a578608fa7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.129645 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle\") pod \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.129698 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data\") pod \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.129728 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwqc8\" (UniqueName: \"kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8\") pod \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\" (UID: \"08c76dbf-4440-4ec1-98a3-14667d5c8a2a\") "
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.130175 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m82zt\" (UniqueName: \"kubernetes.io/projected/733b31a0-f130-49d0-80c6-51a578608fa7-kube-api-access-m82zt\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.130187 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.130198 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733b31a0-f130-49d0-80c6-51a578608fa7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.135964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8" (OuterVolumeSpecName: "kube-api-access-rwqc8") pod "08c76dbf-4440-4ec1-98a3-14667d5c8a2a" (UID: "08c76dbf-4440-4ec1-98a3-14667d5c8a2a"). InnerVolumeSpecName "kube-api-access-rwqc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.142401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerStarted","Data":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"}
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.145358 4909 generic.go:334] "Generic (PLEG): container finished" podID="733b31a0-f130-49d0-80c6-51a578608fa7" containerID="f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6" exitCode=0
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.145416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerDied","Data":"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"}
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.145439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c7ms" event={"ID":"733b31a0-f130-49d0-80c6-51a578608fa7","Type":"ContainerDied","Data":"06c8ff6e334b09fac0014a8d1f780b0cf4f2d726ec08f8f1799be94c96fb0671"}
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.145457 4909 scope.go:117] "RemoveContainer" containerID="f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.145635 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c7ms"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.152778 4909 generic.go:334] "Generic (PLEG): container finished" podID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" containerID="4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630" exitCode=137
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.152810 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08c76dbf-4440-4ec1-98a3-14667d5c8a2a","Type":"ContainerDied","Data":"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"}
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.152831 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08c76dbf-4440-4ec1-98a3-14667d5c8a2a","Type":"ContainerDied","Data":"53c81099f054963c0b842b16900cccd727e0b22fd5d06538f898d6c0a2fe5ab5"}
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.152849 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.160745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data" (OuterVolumeSpecName: "config-data") pod "08c76dbf-4440-4ec1-98a3-14667d5c8a2a" (UID: "08c76dbf-4440-4ec1-98a3-14667d5c8a2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.167633 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c76dbf-4440-4ec1-98a3-14667d5c8a2a" (UID: "08c76dbf-4440-4ec1-98a3-14667d5c8a2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.175643 4909 scope.go:117] "RemoveContainer" containerID="b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.188674 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"]
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.198070 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c7ms"]
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.212258 4909 scope.go:117] "RemoveContainer" containerID="19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.232007 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwqc8\" (UniqueName: \"kubernetes.io/projected/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-kube-api-access-rwqc8\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.232064 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.232099 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c76dbf-4440-4ec1-98a3-14667d5c8a2a-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.234004 4909 scope.go:117] "RemoveContainer" containerID="f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.234705 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6\": container with ID starting with f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6 not found: ID does not exist" containerID="f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.234743 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6"} err="failed to get container status \"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6\": rpc error: code = NotFound desc = could not find container \"f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6\": container with ID starting with f0fa1da556b50b600846a2b7aa20c16bb34ef96c6603aef79ecb8f31ef9b74b6 not found: ID does not exist"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.234764 4909 scope.go:117] "RemoveContainer" containerID="b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.235227 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db\": container with ID starting with b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db not found: ID does not exist" containerID="b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.235250 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db"} err="failed to get container status \"b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db\": rpc error: code = NotFound desc = could not find container \"b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db\": container with ID starting with b1b8cadd66caee9266289fef1b54eca0763319032dc7a74180134d02c97c40db not found: ID does not exist"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.235263 4909 scope.go:117] "RemoveContainer" containerID="19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.235517 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0\": container with ID starting with 19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0 not found: ID does not exist" containerID="19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.235552 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0"} err="failed to get container status \"19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0\": rpc error: code = NotFound desc = could not find container \"19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0\": container with ID starting with 19c817f118f3a0d56b75b5647f37eccf54ca607ed78edfdd25ba60d331600bd0 not found: ID does not exist"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.235579 4909 scope.go:117] "RemoveContainer" containerID="4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.262555 4909 scope.go:117] "RemoveContainer" containerID="4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.263112 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630\": container with ID starting with 4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630 not found: ID does not exist" containerID="4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.263184 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630"} err="failed to get container status \"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630\": rpc error: code = NotFound desc = could not find container \"4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630\": container with ID starting with 4791d789a980a300519b9859cd863e1815e56a294328e71c7d710fac4c06d630 not found: ID does not exist"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.482926 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.492457 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.507595 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.508259 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="extract-utilities"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.508279 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="extract-utilities"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.508311 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="extract-content"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.508318 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="extract-content"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.508346 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.508353 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 18:42:44 crc kubenswrapper[4909]: E1002 18:42:44.508364 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="registry-server"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.508371 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="registry-server"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.509181 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" containerName="registry-server"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.509284 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.510039 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.514429 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.514728 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.514842 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.531213 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.659805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.659861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.660010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.660237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpcx\" (UniqueName: \"kubernetes.io/projected/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-kube-api-access-5lpcx\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.660307 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.762163 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.762500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpcx\" (UniqueName: \"kubernetes.io/projected/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-kube-api-access-5lpcx\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.763006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.763469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.763755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.767725 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.771494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.771647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.775523 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.781973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpcx\" (UniqueName: \"kubernetes.io/projected/fe17b6af-3b5c-45b6-b6d3-140b3873c81d-kube-api-access-5lpcx\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe17b6af-3b5c-45b6-b6d3-140b3873c81d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.840000 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.846187 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.967776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs\") pod \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.968128 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle\") pod \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.968158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wks8g\" (UniqueName: \"kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g\") pod \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\" (UID: 
\"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.968204 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data\") pod \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\" (UID: \"dde70b73-df57-4cdd-b3ab-ddbf98a80f65\") " Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.969332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs" (OuterVolumeSpecName: "logs") pod "dde70b73-df57-4cdd-b3ab-ddbf98a80f65" (UID: "dde70b73-df57-4cdd-b3ab-ddbf98a80f65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:44 crc kubenswrapper[4909]: I1002 18:42:44.977221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g" (OuterVolumeSpecName: "kube-api-access-wks8g") pod "dde70b73-df57-4cdd-b3ab-ddbf98a80f65" (UID: "dde70b73-df57-4cdd-b3ab-ddbf98a80f65"). InnerVolumeSpecName "kube-api-access-wks8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.007709 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dde70b73-df57-4cdd-b3ab-ddbf98a80f65" (UID: "dde70b73-df57-4cdd-b3ab-ddbf98a80f65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.024232 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data" (OuterVolumeSpecName: "config-data") pod "dde70b73-df57-4cdd-b3ab-ddbf98a80f65" (UID: "dde70b73-df57-4cdd-b3ab-ddbf98a80f65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.070228 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.070281 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.070293 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wks8g\" (UniqueName: \"kubernetes.io/projected/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-kube-api-access-wks8g\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.070302 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde70b73-df57-4cdd-b3ab-ddbf98a80f65-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.168397 4909 generic.go:334] "Generic (PLEG): container finished" podID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerID="847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77" exitCode=0 Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.168476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerDied","Data":"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77"} Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.168511 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dde70b73-df57-4cdd-b3ab-ddbf98a80f65","Type":"ContainerDied","Data":"285b1b2db35e24f4be64e8b09a0610286e028d86b8d1722b1f83d676726014fc"} Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.168533 4909 scope.go:117] "RemoveContainer" containerID="847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.168675 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.198238 4909 scope.go:117] "RemoveContainer" containerID="74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.228926 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.242559 4909 scope.go:117] "RemoveContainer" containerID="847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77" Oct 02 18:42:45 crc kubenswrapper[4909]: E1002 18:42:45.243597 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77\": container with ID starting with 847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77 not found: ID does not exist" containerID="847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.243631 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77"} 
err="failed to get container status \"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77\": rpc error: code = NotFound desc = could not find container \"847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77\": container with ID starting with 847f3a8845c87016d845cfb711c7c8f6888feced64096a4d1a08598d82402f77 not found: ID does not exist" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.243665 4909 scope.go:117] "RemoveContainer" containerID="74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7" Oct 02 18:42:45 crc kubenswrapper[4909]: E1002 18:42:45.244531 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7\": container with ID starting with 74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7 not found: ID does not exist" containerID="74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.244561 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7"} err="failed to get container status \"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7\": rpc error: code = NotFound desc = could not find container \"74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7\": container with ID starting with 74a5a0962c97c7bca23a334c29884a6b4d789111cd7f2b773f57cc2eeb7db0c7 not found: ID does not exist" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.245702 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.255054 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:45 crc kubenswrapper[4909]: E1002 18:42:45.255674 4909 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-api" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.255698 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-api" Oct 02 18:42:45 crc kubenswrapper[4909]: E1002 18:42:45.255742 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-log" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.255750 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-log" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.256069 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-log" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.256092 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" containerName="nova-api-api" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.257725 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.260919 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.261074 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.261165 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.263749 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.348455 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 18:42:45 crc kubenswrapper[4909]: W1002 18:42:45.357752 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe17b6af_3b5c_45b6_b6d3_140b3873c81d.slice/crio-adb69efec6adf45197701cc360fa60eed21b756fd4d037dd56e7c8e0d4cce355 WatchSource:0}: Error finding container adb69efec6adf45197701cc360fa60eed21b756fd4d037dd56e7c8e0d4cce355: Status 404 returned error can't find the container with id adb69efec6adf45197701cc360fa60eed21b756fd4d037dd56e7c8e0d4cce355 Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377368 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4j8w\" (UniqueName: \"kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377492 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.377753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.478998 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4j8w\" (UniqueName: \"kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w\") pod \"nova-api-0\" (UID: 
\"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479190 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.479750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.486103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.486376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.488827 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.489418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.503101 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4j8w\" (UniqueName: \"kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w\") pod \"nova-api-0\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.581671 4909 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.627223 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c76dbf-4440-4ec1-98a3-14667d5c8a2a" path="/var/lib/kubelet/pods/08c76dbf-4440-4ec1-98a3-14667d5c8a2a/volumes" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.627977 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733b31a0-f130-49d0-80c6-51a578608fa7" path="/var/lib/kubelet/pods/733b31a0-f130-49d0-80c6-51a578608fa7/volumes" Oct 02 18:42:45 crc kubenswrapper[4909]: I1002 18:42:45.628916 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde70b73-df57-4cdd-b3ab-ddbf98a80f65" path="/var/lib/kubelet/pods/dde70b73-df57-4cdd-b3ab-ddbf98a80f65/volumes" Oct 02 18:42:46 crc kubenswrapper[4909]: W1002 18:42:46.062836 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cb85597_15db_43a3_9634_7dec6ecaf813.slice/crio-ca7a7ab5099116915b4a32817fb0ca78831bd22d4dd713b4746fb69c22babb0a WatchSource:0}: Error finding container ca7a7ab5099116915b4a32817fb0ca78831bd22d4dd713b4746fb69c22babb0a: Status 404 returned error can't find the container with id ca7a7ab5099116915b4a32817fb0ca78831bd22d4dd713b4746fb69c22babb0a Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.074706 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.193092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerStarted","Data":"ca7a7ab5099116915b4a32817fb0ca78831bd22d4dd713b4746fb69c22babb0a"} Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.195634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"fe17b6af-3b5c-45b6-b6d3-140b3873c81d","Type":"ContainerStarted","Data":"1e40dded847b9ede2253758359626a429d78a941f9f100ede2006333d44a8580"} Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.195695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe17b6af-3b5c-45b6-b6d3-140b3873c81d","Type":"ContainerStarted","Data":"adb69efec6adf45197701cc360fa60eed21b756fd4d037dd56e7c8e0d4cce355"} Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.199702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerStarted","Data":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.199844 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-central-agent" containerID="cri-o://9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" gracePeriod=30 Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.199991 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.200055 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="proxy-httpd" containerID="cri-o://670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" gracePeriod=30 Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.200093 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="sg-core" containerID="cri-o://96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" gracePeriod=30 Oct 02 18:42:46 crc 
kubenswrapper[4909]: I1002 18:42:46.200125 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-notification-agent" containerID="cri-o://9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" gracePeriod=30 Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.223106 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.223083627 podStartE2EDuration="2.223083627s" podCreationTimestamp="2025-10-02 18:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:46.212907506 +0000 UTC m=+1487.400403365" watchObservedRunningTime="2025-10-02 18:42:46.223083627 +0000 UTC m=+1487.410579486" Oct 02 18:42:46 crc kubenswrapper[4909]: I1002 18:42:46.251808 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.449900788 podStartE2EDuration="8.251789421s" podCreationTimestamp="2025-10-02 18:42:38 +0000 UTC" firstStartedPulling="2025-10-02 18:42:39.693398836 +0000 UTC m=+1480.880894695" lastFinishedPulling="2025-10-02 18:42:45.495287479 +0000 UTC m=+1486.682783328" observedRunningTime="2025-10-02 18:42:46.247366012 +0000 UTC m=+1487.434861891" watchObservedRunningTime="2025-10-02 18:42:46.251789421 +0000 UTC m=+1487.439285280" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.077280 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114502 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114567 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114656 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114677 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114718 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tww6\" (UniqueName: \"kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114747 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114771 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114809 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data\") pod \"8f6fd096-a538-49e6-8efc-1d35596b75b3\" (UID: \"8f6fd096-a538-49e6-8efc-1d35596b75b3\") " Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.114977 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.115345 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.115370 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.120532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts" (OuterVolumeSpecName: "scripts") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.120805 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6" (OuterVolumeSpecName: "kube-api-access-4tww6") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "kube-api-access-4tww6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.149117 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.179963 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.203341 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.211575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerStarted","Data":"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.211621 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerStarted","Data":"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215288 4909 generic.go:334] "Generic (PLEG): container finished" podID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" exitCode=0 Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215325 4909 generic.go:334] "Generic (PLEG): container finished" podID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" exitCode=2 Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215335 4909 generic.go:334] "Generic (PLEG): container finished" podID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" exitCode=0 Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215347 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" exitCode=0 Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215348 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215388 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerDied","Data":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerDied","Data":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerDied","Data":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerDied","Data":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215469 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8f6fd096-a538-49e6-8efc-1d35596b75b3","Type":"ContainerDied","Data":"5e75642d215dd90725764acc3c06c8a6c5d72d8915011d01acffd7936f7471e9"} Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.215485 4909 scope.go:117] "RemoveContainer" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc 
kubenswrapper[4909]: I1002 18:42:47.217082 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.217100 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f6fd096-a538-49e6-8efc-1d35596b75b3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.217128 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tww6\" (UniqueName: \"kubernetes.io/projected/8f6fd096-a538-49e6-8efc-1d35596b75b3-kube-api-access-4tww6\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.217137 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.217145 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.217155 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.231277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data" (OuterVolumeSpecName: "config-data") pod "8f6fd096-a538-49e6-8efc-1d35596b75b3" (UID: "8f6fd096-a538-49e6-8efc-1d35596b75b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.247553 4909 scope.go:117] "RemoveContainer" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.265804 4909 scope.go:117] "RemoveContainer" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.284356 4909 scope.go:117] "RemoveContainer" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.302142 4909 scope.go:117] "RemoveContainer" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.302629 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": container with ID starting with 670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480 not found: ID does not exist" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.302668 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} err="failed to get container status \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": rpc error: code = NotFound desc = could not find container \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": container with ID starting with 670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.302695 4909 scope.go:117] "RemoveContainer" 
containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.302975 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": container with ID starting with 96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600 not found: ID does not exist" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303003 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"} err="failed to get container status \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": rpc error: code = NotFound desc = could not find container \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": container with ID starting with 96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303019 4909 scope.go:117] "RemoveContainer" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.303283 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": container with ID starting with 9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a not found: ID does not exist" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303329 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"} err="failed to get container status \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": rpc error: code = NotFound desc = could not find container \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": container with ID starting with 9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303345 4909 scope.go:117] "RemoveContainer" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.303556 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": container with ID starting with 9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd not found: ID does not exist" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303579 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"} err="failed to get container status \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": rpc error: code = NotFound desc = could not find container \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": container with ID starting with 9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303593 4909 scope.go:117] "RemoveContainer" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303777 4909 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} err="failed to get container status \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": rpc error: code = NotFound desc = could not find container \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": container with ID starting with 670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303797 4909 scope.go:117] "RemoveContainer" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303963 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"} err="failed to get container status \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": rpc error: code = NotFound desc = could not find container \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": container with ID starting with 96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.303984 4909 scope.go:117] "RemoveContainer" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304230 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"} err="failed to get container status \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": rpc error: code = NotFound desc = could not find container \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": container with ID starting with 9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a not 
found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304250 4909 scope.go:117] "RemoveContainer" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304401 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"} err="failed to get container status \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": rpc error: code = NotFound desc = could not find container \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": container with ID starting with 9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304421 4909 scope.go:117] "RemoveContainer" containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304592 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} err="failed to get container status \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": rpc error: code = NotFound desc = could not find container \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": container with ID starting with 670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304611 4909 scope.go:117] "RemoveContainer" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304765 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"} err="failed to get 
container status \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": rpc error: code = NotFound desc = could not find container \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": container with ID starting with 96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304786 4909 scope.go:117] "RemoveContainer" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304937 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"} err="failed to get container status \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": rpc error: code = NotFound desc = could not find container \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": container with ID starting with 9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.304959 4909 scope.go:117] "RemoveContainer" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305126 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"} err="failed to get container status \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": rpc error: code = NotFound desc = could not find container \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": container with ID starting with 9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305143 4909 scope.go:117] "RemoveContainer" 
containerID="670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305298 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480"} err="failed to get container status \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": rpc error: code = NotFound desc = could not find container \"670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480\": container with ID starting with 670b7dd368deec85b0c5f1203ccc55f2e1e11a159aa95ca62b7a8da4db622480 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305320 4909 scope.go:117] "RemoveContainer" containerID="96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305478 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600"} err="failed to get container status \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": rpc error: code = NotFound desc = could not find container \"96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600\": container with ID starting with 96dfd2183fbf657e7b73a134ba7fe859c82868a681243f87345f7012646d3600 not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305494 4909 scope.go:117] "RemoveContainer" containerID="9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305665 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a"} err="failed to get container status \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": rpc error: code = NotFound desc = could 
not find container \"9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a\": container with ID starting with 9306a4a0734d9a8f2ff1e1a0959f9164d14afce5996fd764ab741b4252da398a not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305684 4909 scope.go:117] "RemoveContainer" containerID="9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.305833 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd"} err="failed to get container status \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": rpc error: code = NotFound desc = could not find container \"9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd\": container with ID starting with 9e90a77406f8b2852635a7b243c9b5e702cd5ffa82ccb4b00452b2b64cbb87cd not found: ID does not exist" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.318479 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6fd096-a538-49e6-8efc-1d35596b75b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.543340 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.54331896 podStartE2EDuration="2.54331896s" podCreationTimestamp="2025-10-02 18:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:47.236502793 +0000 UTC m=+1488.423998652" watchObservedRunningTime="2025-10-02 18:42:47.54331896 +0000 UTC m=+1488.730814819" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.553705 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 
18:42:47.564262 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587152 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.587591 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-central-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587608 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-central-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.587626 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="sg-core" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587633 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="sg-core" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.587649 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="proxy-httpd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587654 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="proxy-httpd" Oct 02 18:42:47 crc kubenswrapper[4909]: E1002 18:42:47.587697 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-notification-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587704 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-notification-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587874 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-notification-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587897 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="proxy-httpd" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587909 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="ceilometer-central-agent" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.587921 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" containerName="sg-core" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.589877 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.593970 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.595204 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.595463 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.600202 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.623459 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.623804 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.623862 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.623951 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.624010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz48j\" (UniqueName: \"kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.624051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.624096 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.624187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.638226 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6fd096-a538-49e6-8efc-1d35596b75b3" path="/var/lib/kubelet/pods/8f6fd096-a538-49e6-8efc-1d35596b75b3/volumes" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725605 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz48j\" (UniqueName: \"kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725650 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725705 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.725787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.726322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc 
kubenswrapper[4909]: I1002 18:42:47.726925 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.728860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.729951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.731408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.740689 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.744563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz48j\" (UniqueName: \"kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j\") pod \"ceilometer-0\" 
(UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.745406 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data\") pod \"ceilometer-0\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " pod="openstack/ceilometer-0" Oct 02 18:42:47 crc kubenswrapper[4909]: I1002 18:42:47.922474 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:42:48 crc kubenswrapper[4909]: W1002 18:42:48.406411 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1017cdd_5bff_4c66_a5fb_5a2e0c1a683e.slice/crio-8ab7ad505111519821e67091f6fdf0a639ec85ef7d1759ff0ad6732ee08e1293 WatchSource:0}: Error finding container 8ab7ad505111519821e67091f6fdf0a639ec85ef7d1759ff0ad6732ee08e1293: Status 404 returned error can't find the container with id 8ab7ad505111519821e67091f6fdf0a639ec85ef7d1759ff0ad6732ee08e1293 Oct 02 18:42:48 crc kubenswrapper[4909]: I1002 18:42:48.407133 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:42:48 crc kubenswrapper[4909]: I1002 18:42:48.664268 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:42:48 crc kubenswrapper[4909]: I1002 18:42:48.735430 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"] Oct 02 18:42:48 crc kubenswrapper[4909]: I1002 18:42:48.735834 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerName="dnsmasq-dns" containerID="cri-o://bfd9a14a1e83428e9662597c47b9b0e06c230cc77bd72f9aaf956efa8e5f0cbd" gracePeriod=10 Oct 02 18:42:49 
crc kubenswrapper[4909]: I1002 18:42:49.251049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerStarted","Data":"8ab7ad505111519821e67091f6fdf0a639ec85ef7d1759ff0ad6732ee08e1293"} Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.253495 4909 generic.go:334] "Generic (PLEG): container finished" podID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerID="bfd9a14a1e83428e9662597c47b9b0e06c230cc77bd72f9aaf956efa8e5f0cbd" exitCode=0 Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.253646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" event={"ID":"fbb7a8f4-dd8d-4940-a84e-28364cb8c285","Type":"ContainerDied","Data":"bfd9a14a1e83428e9662597c47b9b0e06c230cc77bd72f9aaf956efa8e5f0cbd"} Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.381213 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.575405 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9qr\" (UniqueName: \"kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.575795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.575934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.575972 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.576047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.576216 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc\") pod \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\" (UID: \"fbb7a8f4-dd8d-4940-a84e-28364cb8c285\") " Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.579831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr" (OuterVolumeSpecName: "kube-api-access-7d9qr") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "kube-api-access-7d9qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.641480 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.646654 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.649637 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.657704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.663247 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config" (OuterVolumeSpecName: "config") pod "fbb7a8f4-dd8d-4940-a84e-28364cb8c285" (UID: "fbb7a8f4-dd8d-4940-a84e-28364cb8c285"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678465 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678497 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678507 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678518 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678527 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9qr\" (UniqueName: \"kubernetes.io/projected/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-kube-api-access-7d9qr\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.678536 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fbb7a8f4-dd8d-4940-a84e-28364cb8c285-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:42:49 crc kubenswrapper[4909]: I1002 18:42:49.840974 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.267879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerStarted","Data":"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892"} Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.268156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerStarted","Data":"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb"} Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.270059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" event={"ID":"fbb7a8f4-dd8d-4940-a84e-28364cb8c285","Type":"ContainerDied","Data":"c407cb301a87ef69454df0e66a0a97408004d5c52802821ad8bedb32697ca1ef"} Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.270126 4909 scope.go:117] "RemoveContainer" containerID="bfd9a14a1e83428e9662597c47b9b0e06c230cc77bd72f9aaf956efa8e5f0cbd" Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.270151 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-h9fp2" Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.294761 4909 scope.go:117] "RemoveContainer" containerID="6bda7cb8182b2bd0adc961780af2d54dd02409e69449543d8b01e2ff2c07c140" Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.319621 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"] Oct 02 18:42:50 crc kubenswrapper[4909]: I1002 18:42:50.331933 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-h9fp2"] Oct 02 18:42:51 crc kubenswrapper[4909]: I1002 18:42:51.289459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerStarted","Data":"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d"} Oct 02 18:42:51 crc kubenswrapper[4909]: I1002 18:42:51.625898 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" path="/var/lib/kubelet/pods/fbb7a8f4-dd8d-4940-a84e-28364cb8c285/volumes" Oct 02 18:42:53 crc kubenswrapper[4909]: I1002 18:42:53.356010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerStarted","Data":"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09"} Oct 02 18:42:53 crc kubenswrapper[4909]: I1002 18:42:53.356575 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:42:53 crc kubenswrapper[4909]: I1002 18:42:53.406380 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4561320110000002 podStartE2EDuration="6.406360529s" podCreationTimestamp="2025-10-02 18:42:47 +0000 UTC" firstStartedPulling="2025-10-02 18:42:48.41021172 +0000 UTC m=+1489.597707589" lastFinishedPulling="2025-10-02 
18:42:52.360440248 +0000 UTC m=+1493.547936107" observedRunningTime="2025-10-02 18:42:53.394902517 +0000 UTC m=+1494.582398416" watchObservedRunningTime="2025-10-02 18:42:53.406360529 +0000 UTC m=+1494.593856388" Oct 02 18:42:54 crc kubenswrapper[4909]: I1002 18:42:54.841140 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:54 crc kubenswrapper[4909]: I1002 18:42:54.865219 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.399087 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.582253 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.582585 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.620404 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jl4jd"] Oct 02 18:42:55 crc kubenswrapper[4909]: E1002 18:42:55.620771 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerName="init" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.620785 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerName="init" Oct 02 18:42:55 crc kubenswrapper[4909]: E1002 18:42:55.620804 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerName="dnsmasq-dns" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.620811 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" 
containerName="dnsmasq-dns" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.621090 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb7a8f4-dd8d-4940-a84e-28364cb8c285" containerName="dnsmasq-dns" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.621970 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.626914 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jl4jd"] Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.628476 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.628770 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.819153 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.819246 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2r28\" (UniqueName: \"kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.819310 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.819344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.921498 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.921678 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.922452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2r28\" (UniqueName: \"kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.922545 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.930182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.931637 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.933998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.946300 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2r28\" (UniqueName: \"kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28\") pod \"nova-cell1-cell-mapping-jl4jd\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:55 crc kubenswrapper[4909]: I1002 18:42:55.960060 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:42:56 crc kubenswrapper[4909]: I1002 18:42:56.491798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jl4jd"] Oct 02 18:42:56 crc kubenswrapper[4909]: W1002 18:42:56.504520 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbce99eed_6e71_4b26_9ea2_5f5cd06cda4e.slice/crio-9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6 WatchSource:0}: Error finding container 9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6: Status 404 returned error can't find the container with id 9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6 Oct 02 18:42:56 crc kubenswrapper[4909]: I1002 18:42:56.604642 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:56 crc kubenswrapper[4909]: I1002 18:42:56.604932 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:42:57 crc kubenswrapper[4909]: I1002 18:42:57.411347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jl4jd" event={"ID":"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e","Type":"ContainerStarted","Data":"fc554b813fc5457dfb32ffaf5674d6505d15b95445b6608809fadc3bd741baae"} Oct 02 18:42:57 crc kubenswrapper[4909]: I1002 18:42:57.411613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jl4jd" 
event={"ID":"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e","Type":"ContainerStarted","Data":"9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6"} Oct 02 18:42:57 crc kubenswrapper[4909]: I1002 18:42:57.433315 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jl4jd" podStartSLOduration=2.433264582 podStartE2EDuration="2.433264582s" podCreationTimestamp="2025-10-02 18:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:42:57.430631319 +0000 UTC m=+1498.618127178" watchObservedRunningTime="2025-10-02 18:42:57.433264582 +0000 UTC m=+1498.620760461" Oct 02 18:43:02 crc kubenswrapper[4909]: I1002 18:43:02.483759 4909 generic.go:334] "Generic (PLEG): container finished" podID="bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" containerID="fc554b813fc5457dfb32ffaf5674d6505d15b95445b6608809fadc3bd741baae" exitCode=0 Oct 02 18:43:02 crc kubenswrapper[4909]: I1002 18:43:02.483847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jl4jd" event={"ID":"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e","Type":"ContainerDied","Data":"fc554b813fc5457dfb32ffaf5674d6505d15b95445b6608809fadc3bd741baae"} Oct 02 18:43:03 crc kubenswrapper[4909]: I1002 18:43:03.498660 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerID="a823801d4bacac69a0d22aaea122bbc7ebe6170cd9197e9ffecb4dc97c9fb3b3" exitCode=137 Oct 02 18:43:03 crc kubenswrapper[4909]: I1002 18:43:03.498721 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerDied","Data":"a823801d4bacac69a0d22aaea122bbc7ebe6170cd9197e9ffecb4dc97c9fb3b3"} Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.143201 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.151735 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.324600 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data\") pod \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.324705 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle\") pod \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.324857 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle\") pod \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.325609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts\") pod \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.325881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p79v5\" (UniqueName: \"kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5\") pod \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\" (UID: 
\"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.326018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2r28\" (UniqueName: \"kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28\") pod \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.326673 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data\") pod \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\" (UID: \"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.326779 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts\") pod \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\" (UID: \"6e3d6540-58f9-4213-b9d3-c1f645eb5107\") " Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.334328 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5" (OuterVolumeSpecName: "kube-api-access-p79v5") pod "6e3d6540-58f9-4213-b9d3-c1f645eb5107" (UID: "6e3d6540-58f9-4213-b9d3-c1f645eb5107"). InnerVolumeSpecName "kube-api-access-p79v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.334602 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28" (OuterVolumeSpecName: "kube-api-access-q2r28") pod "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" (UID: "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e"). InnerVolumeSpecName "kube-api-access-q2r28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.334885 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts" (OuterVolumeSpecName: "scripts") pod "6e3d6540-58f9-4213-b9d3-c1f645eb5107" (UID: "6e3d6540-58f9-4213-b9d3-c1f645eb5107"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.338306 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts" (OuterVolumeSpecName: "scripts") pod "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" (UID: "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.384575 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" (UID: "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.387867 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data" (OuterVolumeSpecName: "config-data") pod "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" (UID: "bce99eed-6e71-4b26-9ea2-5f5cd06cda4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430342 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p79v5\" (UniqueName: \"kubernetes.io/projected/6e3d6540-58f9-4213-b9d3-c1f645eb5107-kube-api-access-p79v5\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430372 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2r28\" (UniqueName: \"kubernetes.io/projected/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-kube-api-access-q2r28\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430384 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430392 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430401 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.430410 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.457239 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e3d6540-58f9-4213-b9d3-c1f645eb5107" (UID: 
"6e3d6540-58f9-4213-b9d3-c1f645eb5107"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.472967 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data" (OuterVolumeSpecName: "config-data") pod "6e3d6540-58f9-4213-b9d3-c1f645eb5107" (UID: "6e3d6540-58f9-4213-b9d3-c1f645eb5107"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.511755 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jl4jd" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.512442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jl4jd" event={"ID":"bce99eed-6e71-4b26-9ea2-5f5cd06cda4e","Type":"ContainerDied","Data":"9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6"} Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.512532 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b45b3e34c8af99246733ace165185eaa322d0b955764aefd7a36537020874e6" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.514945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e3d6540-58f9-4213-b9d3-c1f645eb5107","Type":"ContainerDied","Data":"d23ecbdeecb734f2d58c1428a2259bad9ac12c4d85ddc5220236317c9aa12b4f"} Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.514993 4909 scope.go:117] "RemoveContainer" containerID="a823801d4bacac69a0d22aaea122bbc7ebe6170cd9197e9ffecb4dc97c9fb3b3" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.515201 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.536504 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.536735 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d6540-58f9-4213-b9d3-c1f645eb5107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.594634 4909 scope.go:117] "RemoveContainer" containerID="7141d4d0d7dd31d6f65a754876a75efc99d82d1576f3e612305f2dad94e3b0e3" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.624574 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.626983 4909 scope.go:117] "RemoveContainer" containerID="b1bdb5b0300b816d1a88c3e450e7b0ebe9ab507647a67389d98ca3444cf20cba" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.645056 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.649692 4909 scope.go:117] "RemoveContainer" containerID="027d03a71e93b7a88e39c9be9b9ebbfdf73e0fcd15095ff412c2f4a13f5e59fe" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.659467 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: E1002 18:43:04.659984 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-notifier" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660007 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-notifier" Oct 02 18:43:04 crc kubenswrapper[4909]: E1002 18:43:04.660071 
4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-api" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660081 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-api" Oct 02 18:43:04 crc kubenswrapper[4909]: E1002 18:43:04.660097 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-listener" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660116 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-listener" Oct 02 18:43:04 crc kubenswrapper[4909]: E1002 18:43:04.660138 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-evaluator" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660145 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-evaluator" Oct 02 18:43:04 crc kubenswrapper[4909]: E1002 18:43:04.660162 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" containerName="nova-manage" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660172 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" containerName="nova-manage" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660421 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-notifier" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660439 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-evaluator" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660451 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" containerName="nova-manage" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660474 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-api" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.660494 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" containerName="aodh-listener" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.662716 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.667433 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t2tfr" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.667645 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.668292 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.668644 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.668968 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.682372 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.704812 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.705138 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" 
containerName="nova-api-log" containerID="cri-o://13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8" gracePeriod=30 Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.705688 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-api" containerID="cri-o://853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d" gracePeriod=30 Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.726197 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.726550 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4a40adda-36e4-4145-a620-50792614e9e7" containerName="nova-scheduler-scheduler" containerID="cri-o://d578cad187e4f2ad5ed1dfa936c8c490625b602fbd1ef7dc9e39a63f912064fb" gracePeriod=30 Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.735153 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.735375 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" containerID="cri-o://892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957" gracePeriod=30 Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.735523 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" containerID="cri-o://0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38" gracePeriod=30 Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743422 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743478 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743571 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8rp\" (UniqueName: \"kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.743704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data\") pod \"aodh-0\" (UID: 
\"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.846179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.846290 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8rp\" (UniqueName: \"kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.846413 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.847314 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.847480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.847568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.854535 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.855621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.855809 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.856766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.860253 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:04 crc kubenswrapper[4909]: I1002 18:43:04.866160 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8rp\" 
(UniqueName: \"kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp\") pod \"aodh-0\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " pod="openstack/aodh-0" Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.007427 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.532915 4909 generic.go:334] "Generic (PLEG): container finished" podID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerID="892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957" exitCode=143 Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.532986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerDied","Data":"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957"} Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.539281 4909 generic.go:334] "Generic (PLEG): container finished" podID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerID="13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8" exitCode=143 Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.539310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerDied","Data":"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8"} Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.577992 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:43:05 crc kubenswrapper[4909]: I1002 18:43:05.624396 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3d6540-58f9-4213-b9d3-c1f645eb5107" path="/var/lib/kubelet/pods/6e3d6540-58f9-4213-b9d3-c1f645eb5107/volumes" Oct 02 18:43:06 crc kubenswrapper[4909]: I1002 18:43:06.558984 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerStarted","Data":"f77258fa0c6be8faf2df03d25ab49d56117ae6cc49e5f1559ba5a282b6f3529d"} Oct 02 18:43:07 crc kubenswrapper[4909]: I1002 18:43:07.600570 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a40adda-36e4-4145-a620-50792614e9e7" containerID="d578cad187e4f2ad5ed1dfa936c8c490625b602fbd1ef7dc9e39a63f912064fb" exitCode=0 Oct 02 18:43:07 crc kubenswrapper[4909]: I1002 18:43:07.600579 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a40adda-36e4-4145-a620-50792614e9e7","Type":"ContainerDied","Data":"d578cad187e4f2ad5ed1dfa936c8c490625b602fbd1ef7dc9e39a63f912064fb"} Oct 02 18:43:07 crc kubenswrapper[4909]: I1002 18:43:07.897846 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.012364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hkc2\" (UniqueName: \"kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2\") pod \"4a40adda-36e4-4145-a620-50792614e9e7\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.012649 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data\") pod \"4a40adda-36e4-4145-a620-50792614e9e7\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.012684 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle\") pod \"4a40adda-36e4-4145-a620-50792614e9e7\" (UID: \"4a40adda-36e4-4145-a620-50792614e9e7\") " Oct 02 18:43:08 crc kubenswrapper[4909]: 
I1002 18:43:08.022971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2" (OuterVolumeSpecName: "kube-api-access-7hkc2") pod "4a40adda-36e4-4145-a620-50792614e9e7" (UID: "4a40adda-36e4-4145-a620-50792614e9e7"). InnerVolumeSpecName "kube-api-access-7hkc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.052392 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data" (OuterVolumeSpecName: "config-data") pod "4a40adda-36e4-4145-a620-50792614e9e7" (UID: "4a40adda-36e4-4145-a620-50792614e9e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.062667 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a40adda-36e4-4145-a620-50792614e9e7" (UID: "4a40adda-36e4-4145-a620-50792614e9e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.088845 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": dial tcp 10.217.0.233:8775: connect: connection refused" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.088850 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": dial tcp 10.217.0.233:8775: connect: connection refused" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.116111 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hkc2\" (UniqueName: \"kubernetes.io/projected/4a40adda-36e4-4145-a620-50792614e9e7-kube-api-access-7hkc2\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.116171 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.116190 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a40adda-36e4-4145-a620-50792614e9e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.566091 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.574264 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.617250 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerStarted","Data":"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.623535 4909 generic.go:334] "Generic (PLEG): container finished" podID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerID="0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38" exitCode=0 Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.623618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerDied","Data":"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.623653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d","Type":"ContainerDied","Data":"3d361289bfa64444327348f81d7a067be7ec8fdeea0e102e126e92bde2455dd2"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.623672 4909 scope.go:117] "RemoveContainer" containerID="0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.623801 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.629135 4909 generic.go:334] "Generic (PLEG): container finished" podID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerID="853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d" exitCode=0 Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.629216 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.629219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerDied","Data":"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.629335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cb85597-15db-43a3-9634-7dec6ecaf813","Type":"ContainerDied","Data":"ca7a7ab5099116915b4a32817fb0ca78831bd22d4dd713b4746fb69c22babb0a"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633177 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633240 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xll\" (UniqueName: \"kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll\") pod \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633307 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633346 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle\") pod 
\"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633399 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs\") pod \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633485 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633530 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4j8w\" (UniqueName: \"kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633617 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs\") pod \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633671 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs\") pod \"2cb85597-15db-43a3-9634-7dec6ecaf813\" (UID: \"2cb85597-15db-43a3-9634-7dec6ecaf813\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633755 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data\") pod \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\" (UID: \"3c0aaf37-f800-46eb-a8d1-9cf16385ec5d\") " Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.633909 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs" (OuterVolumeSpecName: "logs") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.637064 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cb85597-15db-43a3-9634-7dec6ecaf813-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.637455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs" (OuterVolumeSpecName: "logs") pod "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" (UID: "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.639604 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w" (OuterVolumeSpecName: "kube-api-access-k4j8w") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "kube-api-access-k4j8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.640949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a40adda-36e4-4145-a620-50792614e9e7","Type":"ContainerDied","Data":"3096a0385f49c7e9fc663131f36618d56417bcc703062dd787478bd74d815b0e"} Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.641068 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.642120 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll" (OuterVolumeSpecName: "kube-api-access-h6xll") pod "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" (UID: "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d"). InnerVolumeSpecName "kube-api-access-h6xll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.680783 4909 scope.go:117] "RemoveContainer" containerID="892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.707484 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data" (OuterVolumeSpecName: "config-data") pod "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" (UID: "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.714116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data" (OuterVolumeSpecName: "config-data") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.729999 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.734875 4909 scope.go:117] "RemoveContainer" containerID="0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.735847 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38\": container with ID starting with 0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38 not found: ID does not exist" containerID="0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.735896 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38"} err="failed to get container status \"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38\": rpc error: code = NotFound desc = could not find container \"0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38\": container with ID starting with 0bd96510f4e4c7c8f0180532ab7cb4931511031adb7b200aab26fd9e9a2d5e38 not found: ID does not exist" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.735922 4909 scope.go:117] 
"RemoveContainer" containerID="892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.736173 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957\": container with ID starting with 892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957 not found: ID does not exist" containerID="892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.736193 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957"} err="failed to get container status \"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957\": rpc error: code = NotFound desc = could not find container \"892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957\": container with ID starting with 892a5f87401b0bcb4a38f57e21f5a461721d089fdd9740979efb51b05a0af957 not found: ID does not exist" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.736209 4909 scope.go:117] "RemoveContainer" containerID="853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.738185 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.738221 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4j8w\" (UniqueName: \"kubernetes.io/projected/2cb85597-15db-43a3-9634-7dec6ecaf813-kube-api-access-k4j8w\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.738237 4909 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-logs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.738256 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.738271 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xll\" (UniqueName: \"kubernetes.io/projected/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-kube-api-access-h6xll\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.739447 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.745155 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756179 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.756718 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a40adda-36e4-4145-a620-50792614e9e7" containerName="nova-scheduler-scheduler" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756741 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a40adda-36e4-4145-a620-50792614e9e7" containerName="nova-scheduler-scheduler" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.756764 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-log" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756773 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-log" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.756811 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756820 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.756839 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756847 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" Oct 02 18:43:08 crc 
kubenswrapper[4909]: E1002 18:43:08.756866 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-api" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.756873 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-api" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.757516 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a40adda-36e4-4145-a620-50792614e9e7" containerName="nova-scheduler-scheduler" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.757547 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-log" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.757562 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-metadata" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.757573 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" containerName="nova-api-api" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.757603 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" containerName="nova-metadata-log" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.758727 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.760650 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.762628 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" (UID: "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.769615 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.774155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.795570 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2cb85597-15db-43a3-9634-7dec6ecaf813" (UID: "2cb85597-15db-43a3-9634-7dec6ecaf813"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.799008 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" (UID: "3c0aaf37-f800-46eb-a8d1-9cf16385ec5d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.810334 4909 scope.go:117] "RemoveContainer" containerID="13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.836553 4909 scope.go:117] "RemoveContainer" containerID="853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.839072 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d\": container with ID starting with 853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d not found: ID does not exist" containerID="853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.839109 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d"} err="failed to get container status \"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d\": rpc error: code = NotFound desc = could not find container \"853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d\": container with ID starting with 853a507f2e36c4e12bd9a0d40e28df37376531ab41b26e5ce29f862c7a98446d not found: ID does not exist" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.839134 4909 
scope.go:117] "RemoveContainer" containerID="13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.842993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv6zs\" (UniqueName: \"kubernetes.io/projected/928ab855-d71e-48e2-bbae-e872154de8bf-kube-api-access-qv6zs\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-config-data\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843579 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: E1002 18:43:08.843825 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8\": container with ID starting with 13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8 not found: ID does not exist" containerID="13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843867 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8"} err="failed to get container 
status \"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8\": rpc error: code = NotFound desc = could not find container \"13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8\": container with ID starting with 13b64617b3ee60d6167ffc87b938ebe9389314d155b4f6f1d37bf1ae7f7b88f8 not found: ID does not exist" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843893 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843902 4909 scope.go:117] "RemoveContainer" containerID="d578cad187e4f2ad5ed1dfa936c8c490625b602fbd1ef7dc9e39a63f912064fb" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843908 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843918 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843933 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.843981 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cb85597-15db-43a3-9634-7dec6ecaf813-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.945157 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-config-data\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.945274 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.945357 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6zs\" (UniqueName: \"kubernetes.io/projected/928ab855-d71e-48e2-bbae-e872154de8bf-kube-api-access-qv6zs\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.950412 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-config-data\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.951171 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928ab855-d71e-48e2-bbae-e872154de8bf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.963307 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6zs\" (UniqueName: \"kubernetes.io/projected/928ab855-d71e-48e2-bbae-e872154de8bf-kube-api-access-qv6zs\") pod 
\"nova-scheduler-0\" (UID: \"928ab855-d71e-48e2-bbae-e872154de8bf\") " pod="openstack/nova-scheduler-0" Oct 02 18:43:08 crc kubenswrapper[4909]: I1002 18:43:08.992098 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.002468 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.012391 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.020446 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.031636 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.033473 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.036365 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.036572 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.036730 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.044959 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.046590 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047554 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-config-data\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047703 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswc6\" (UniqueName: \"kubernetes.io/projected/080b267d-f77b-4b0e-ab73-326a3c9b67b9-kube-api-access-dswc6\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.047738 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080b267d-f77b-4b0e-ab73-326a3c9b67b9-logs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.051329 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.051486 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.053174 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.064207 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.103752 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149662 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149708 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswc6\" (UniqueName: \"kubernetes.io/projected/080b267d-f77b-4b0e-ab73-326a3c9b67b9-kube-api-access-dswc6\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149746 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-config-data\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlvff\" (UniqueName: \"kubernetes.io/projected/55be370d-35f9-4114-9fd5-48a0b939125a-kube-api-access-tlvff\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 
18:43:09.149814 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55be370d-35f9-4114-9fd5-48a0b939125a-logs\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080b267d-f77b-4b0e-ab73-326a3c9b67b9-logs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.149995 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.150075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-config-data\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.151115 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080b267d-f77b-4b0e-ab73-326a3c9b67b9-logs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.160286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.160692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.160952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-config-data\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.161194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080b267d-f77b-4b0e-ab73-326a3c9b67b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.173525 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dswc6\" (UniqueName: \"kubernetes.io/projected/080b267d-f77b-4b0e-ab73-326a3c9b67b9-kube-api-access-dswc6\") pod \"nova-api-0\" (UID: \"080b267d-f77b-4b0e-ab73-326a3c9b67b9\") " pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.251322 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-config-data\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.251372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.251396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlvff\" (UniqueName: \"kubernetes.io/projected/55be370d-35f9-4114-9fd5-48a0b939125a-kube-api-access-tlvff\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.251429 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55be370d-35f9-4114-9fd5-48a0b939125a-logs\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.251564 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.252583 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55be370d-35f9-4114-9fd5-48a0b939125a-logs\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.258565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-config-data\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.258909 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.259115 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55be370d-35f9-4114-9fd5-48a0b939125a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.269603 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlvff\" (UniqueName: \"kubernetes.io/projected/55be370d-35f9-4114-9fd5-48a0b939125a-kube-api-access-tlvff\") pod \"nova-metadata-0\" (UID: \"55be370d-35f9-4114-9fd5-48a0b939125a\") " pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.365341 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.373665 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.586199 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.624952 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb85597-15db-43a3-9634-7dec6ecaf813" path="/var/lib/kubelet/pods/2cb85597-15db-43a3-9634-7dec6ecaf813/volumes" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.625916 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0aaf37-f800-46eb-a8d1-9cf16385ec5d" path="/var/lib/kubelet/pods/3c0aaf37-f800-46eb-a8d1-9cf16385ec5d/volumes" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.627129 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a40adda-36e4-4145-a620-50792614e9e7" path="/var/lib/kubelet/pods/4a40adda-36e4-4145-a620-50792614e9e7/volumes" Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.664039 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerStarted","Data":"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825"} Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.670650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"928ab855-d71e-48e2-bbae-e872154de8bf","Type":"ContainerStarted","Data":"9aabe8b9499a79a21ec711d71a70449014985e965a0549a67a335eed7b9ab60c"} Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.848043 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: W1002 18:43:09.850347 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod080b267d_f77b_4b0e_ab73_326a3c9b67b9.slice/crio-169b822e742bb47b07d8fa6bdaf432b2e97816866bbab59cf0b26b8832a57221 WatchSource:0}: Error finding container 169b822e742bb47b07d8fa6bdaf432b2e97816866bbab59cf0b26b8832a57221: Status 404 returned error can't find the container with id 169b822e742bb47b07d8fa6bdaf432b2e97816866bbab59cf0b26b8832a57221 Oct 02 18:43:09 crc kubenswrapper[4909]: I1002 18:43:09.924055 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 18:43:09 crc kubenswrapper[4909]: W1002 18:43:09.936727 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55be370d_35f9_4114_9fd5_48a0b939125a.slice/crio-01d74397c8990e0ac0124b5dfa827e6105ca73134e758f13903460dd8fcbaeee WatchSource:0}: Error finding container 01d74397c8990e0ac0124b5dfa827e6105ca73134e758f13903460dd8fcbaeee: Status 404 returned error can't find the container with id 01d74397c8990e0ac0124b5dfa827e6105ca73134e758f13903460dd8fcbaeee Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.704960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55be370d-35f9-4114-9fd5-48a0b939125a","Type":"ContainerStarted","Data":"662c4bede55f4b51773dffa506188986b1f555839749d40436300e457f49d73f"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.705313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"55be370d-35f9-4114-9fd5-48a0b939125a","Type":"ContainerStarted","Data":"3596806839b1abe3c1389caf1d8cf54d934d03aef0499642f9e415630fc02a9f"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.705332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"55be370d-35f9-4114-9fd5-48a0b939125a","Type":"ContainerStarted","Data":"01d74397c8990e0ac0124b5dfa827e6105ca73134e758f13903460dd8fcbaeee"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.708624 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"928ab855-d71e-48e2-bbae-e872154de8bf","Type":"ContainerStarted","Data":"3d4be4c408db13f9f76430449a8e8559807a76dc82c279f6d63f1fbfc2a362f0"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.716687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"080b267d-f77b-4b0e-ab73-326a3c9b67b9","Type":"ContainerStarted","Data":"9b8b5d03f0c298222f3276bf436d957ac0b7bb48e8f4ffcf78e9171d56bfb4e5"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.716727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"080b267d-f77b-4b0e-ab73-326a3c9b67b9","Type":"ContainerStarted","Data":"59eaff8de19d46695fc7224d8d21a52744597d938c1be3bac012482d353140a5"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.716737 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"080b267d-f77b-4b0e-ab73-326a3c9b67b9","Type":"ContainerStarted","Data":"169b822e742bb47b07d8fa6bdaf432b2e97816866bbab59cf0b26b8832a57221"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.725829 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.72581551 podStartE2EDuration="2.72581551s" podCreationTimestamp="2025-10-02 18:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:43:10.723498067 +0000 UTC m=+1511.910993926" watchObservedRunningTime="2025-10-02 18:43:10.72581551 +0000 UTC m=+1511.913311369" Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.729473 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerStarted","Data":"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd"} Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.755505 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.755489164 podStartE2EDuration="2.755489164s" podCreationTimestamp="2025-10-02 18:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:43:10.740984168 +0000 UTC m=+1511.928480027" watchObservedRunningTime="2025-10-02 18:43:10.755489164 +0000 UTC m=+1511.942985023" Oct 02 18:43:10 crc kubenswrapper[4909]: I1002 18:43:10.772494 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7724783 podStartE2EDuration="2.7724783s" podCreationTimestamp="2025-10-02 18:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:43:10.760281636 +0000 UTC m=+1511.947777495" watchObservedRunningTime="2025-10-02 18:43:10.7724783 +0000 UTC m=+1511.959974159" Oct 02 18:43:11 crc kubenswrapper[4909]: I1002 18:43:11.741000 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerStarted","Data":"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38"} Oct 02 18:43:11 crc kubenswrapper[4909]: I1002 18:43:11.762540 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.838119902 podStartE2EDuration="7.76252297s" podCreationTimestamp="2025-10-02 18:43:04 +0000 UTC" firstStartedPulling="2025-10-02 18:43:05.567528574 +0000 UTC m=+1506.755024433" lastFinishedPulling="2025-10-02 
18:43:10.491931642 +0000 UTC m=+1511.679427501" observedRunningTime="2025-10-02 18:43:11.759900898 +0000 UTC m=+1512.947396787" watchObservedRunningTime="2025-10-02 18:43:11.76252297 +0000 UTC m=+1512.950018829" Oct 02 18:43:14 crc kubenswrapper[4909]: I1002 18:43:14.105066 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 18:43:14 crc kubenswrapper[4909]: I1002 18:43:14.374243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 18:43:14 crc kubenswrapper[4909]: I1002 18:43:14.374620 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 18:43:17 crc kubenswrapper[4909]: I1002 18:43:17.931886 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.104754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.141386 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.366594 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.366676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.374976 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.375059 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 18:43:19 crc kubenswrapper[4909]: I1002 18:43:19.881613 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 18:43:20 crc kubenswrapper[4909]: I1002 18:43:20.381212 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="080b267d-f77b-4b0e-ab73-326a3c9b67b9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.248:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:43:20 crc kubenswrapper[4909]: I1002 18:43:20.381207 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="080b267d-f77b-4b0e-ab73-326a3c9b67b9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.248:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:43:20 crc kubenswrapper[4909]: I1002 18:43:20.396307 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="55be370d-35f9-4114-9fd5-48a0b939125a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:43:20 crc kubenswrapper[4909]: I1002 18:43:20.396307 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="55be370d-35f9-4114-9fd5-48a0b939125a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 18:43:23 crc kubenswrapper[4909]: I1002 18:43:23.054632 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:43:23 crc kubenswrapper[4909]: I1002 18:43:23.055228 4909 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.376769 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.378429 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.380914 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.386810 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.390020 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.390630 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.394924 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.964543 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.969588 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 18:43:29 crc kubenswrapper[4909]: I1002 18:43:29.970561 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 18:43:39 crc kubenswrapper[4909]: I1002 
18:43:39.918708 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-f8b6m"] Oct 02 18:43:39 crc kubenswrapper[4909]: I1002 18:43:39.928550 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-f8b6m"] Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.068768 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-hfpp7"] Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.070527 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.082468 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hfpp7"] Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.092046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.092370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.092571 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6kqf\" (UniqueName: \"kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.196541 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.196950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6kqf\" (UniqueName: \"kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.197531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.204789 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.208988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.213622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6kqf\" (UniqueName: 
\"kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf\") pod \"heat-db-sync-hfpp7\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.410430 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hfpp7" Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.921624 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hfpp7"] Oct 02 18:43:40 crc kubenswrapper[4909]: I1002 18:43:40.983809 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 18:43:41 crc kubenswrapper[4909]: I1002 18:43:41.143611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hfpp7" event={"ID":"337abd66-4416-4bbb-9ee9-29e704fabd94","Type":"ContainerStarted","Data":"3aadb5bce0148adb03934279d0f6e326bf9f38b7028da9128cc94f204f1899a9"} Oct 02 18:43:41 crc kubenswrapper[4909]: I1002 18:43:41.634901 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f47e10-7e95-4d88-ae97-71b020828eaa" path="/var/lib/kubelet/pods/e7f47e10-7e95-4d88-ae97-71b020828eaa/volumes" Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.028866 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.135680 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.135975 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-central-agent" containerID="cri-o://31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb" gracePeriod=30 Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.136397 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="proxy-httpd" containerID="cri-o://10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09" gracePeriod=30 Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.136446 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="sg-core" containerID="cri-o://eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d" gracePeriod=30 Oct 02 18:43:42 crc kubenswrapper[4909]: I1002 18:43:42.136477 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-notification-agent" containerID="cri-o://0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892" gracePeriod=30 Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.160285 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182034 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerID="10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09" exitCode=0 Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182064 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerID="eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d" exitCode=2 Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182073 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerID="31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb" exitCode=0 Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182062 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerDied","Data":"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09"} Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182111 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerDied","Data":"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d"} Oct 02 18:43:43 crc kubenswrapper[4909]: I1002 18:43:43.182124 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerDied","Data":"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb"} Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.129679 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.197047 4909 generic.go:334] "Generic (PLEG): container finished" podID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerID="0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892" exitCode=0 Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.197088 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerDied","Data":"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892"} Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.197118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e","Type":"ContainerDied","Data":"8ab7ad505111519821e67091f6fdf0a639ec85ef7d1759ff0ad6732ee08e1293"} Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.197133 4909 scope.go:117] "RemoveContainer" containerID="10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09" Oct 
02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.197249 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.243285 4909 scope.go:117] "RemoveContainer" containerID="eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.284524 4909 scope.go:117] "RemoveContainer" containerID="0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289614 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz48j\" (UniqueName: \"kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289893 
4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.289915 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.290006 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.290021 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml\") pod \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\" (UID: \"e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e\") " Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.290348 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.290540 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.290837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.315733 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j" (OuterVolumeSpecName: "kube-api-access-mz48j") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "kube-api-access-mz48j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.315881 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts" (OuterVolumeSpecName: "scripts") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.349570 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.357299 4909 scope.go:117] "RemoveContainer" containerID="31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.363880 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.392173 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.392200 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.392210 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.392221 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.392229 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz48j\" (UniqueName: 
\"kubernetes.io/projected/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-kube-api-access-mz48j\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.434447 4909 scope.go:117] "RemoveContainer" containerID="10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.437281 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09\": container with ID starting with 10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09 not found: ID does not exist" containerID="10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.437318 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09"} err="failed to get container status \"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09\": rpc error: code = NotFound desc = could not find container \"10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09\": container with ID starting with 10a9be6dfd4a469488e87b77a62f97011e5e36679d15c614cd60d383cb87fb09 not found: ID does not exist" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.437342 4909 scope.go:117] "RemoveContainer" containerID="eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.438097 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d\": container with ID starting with eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d not found: ID does not exist" 
containerID="eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.438137 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d"} err="failed to get container status \"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d\": rpc error: code = NotFound desc = could not find container \"eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d\": container with ID starting with eed02cb64bdb8be245d0d7a107aba4df9918d329098ef57c1c7c01c0cbec516d not found: ID does not exist" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.438164 4909 scope.go:117] "RemoveContainer" containerID="0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.438757 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892\": container with ID starting with 0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892 not found: ID does not exist" containerID="0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.438785 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892"} err="failed to get container status \"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892\": rpc error: code = NotFound desc = could not find container \"0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892\": container with ID starting with 0b7e2b28b32e51e2f3304f8e4754c7d636d09d171f60f8bdfcddd2ef69251892 not found: ID does not exist" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.438804 4909 scope.go:117] 
"RemoveContainer" containerID="31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.439076 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb\": container with ID starting with 31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb not found: ID does not exist" containerID="31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.439106 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb"} err="failed to get container status \"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb\": rpc error: code = NotFound desc = could not find container \"31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb\": container with ID starting with 31ef4ba2971e956b44e6b88a3e28776772ea57653035a0ce31f6a38c9d4631bb not found: ID does not exist" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.439216 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data" (OuterVolumeSpecName: "config-data") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.464157 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" (UID: "e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.494256 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.494285 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.548258 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.557780 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.569543 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.570049 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-notification-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570066 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-notification-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.570088 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-central-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570094 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-central-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.570112 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="proxy-httpd" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570118 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="proxy-httpd" Oct 02 18:43:44 crc kubenswrapper[4909]: E1002 18:43:44.570127 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="sg-core" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570134 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="sg-core" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570342 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="sg-core" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570360 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-central-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570368 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="proxy-httpd" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.570383 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" containerName="ceilometer-notification-agent" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.572211 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.582439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.582668 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.582756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.585495 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.698770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.698891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.698969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.699087 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.699212 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.699269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.699370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.699439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnr5h\" (UniqueName: \"kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.800821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnr5h\" (UniqueName: \"kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h\") pod \"ceilometer-0\" (UID: 
\"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801181 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801222 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801294 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.801703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.802109 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.805509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.808531 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.808979 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.812618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.817226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.818487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnr5h\" (UniqueName: \"kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h\") pod \"ceilometer-0\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " pod="openstack/ceilometer-0" Oct 02 18:43:44 crc kubenswrapper[4909]: I1002 18:43:44.893082 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 18:43:45 crc kubenswrapper[4909]: W1002 18:43:45.384582 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555cac57_752b_43f7_80f6_2d768759cad4.slice/crio-da77ab77fadc5aef99d136ef3706bec6027dac72b8a11a654b70f48c9e82ca27 WatchSource:0}: Error finding container da77ab77fadc5aef99d136ef3706bec6027dac72b8a11a654b70f48c9e82ca27: Status 404 returned error can't find the container with id da77ab77fadc5aef99d136ef3706bec6027dac72b8a11a654b70f48c9e82ca27 Oct 02 18:43:45 crc kubenswrapper[4909]: I1002 18:43:45.386019 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 18:43:45 crc kubenswrapper[4909]: I1002 18:43:45.621354 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e" path="/var/lib/kubelet/pods/e1017cdd-5bff-4c66-a5fb-5a2e0c1a683e/volumes" Oct 02 18:43:46 crc kubenswrapper[4909]: I1002 18:43:46.225805 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerStarted","Data":"da77ab77fadc5aef99d136ef3706bec6027dac72b8a11a654b70f48c9e82ca27"} Oct 02 18:43:46 crc kubenswrapper[4909]: I1002 18:43:46.962838 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" containerID="cri-o://15b88cbc58eb05c48efebf39a96fe5676c9884e5c277e9588c948540649bfcf2" gracePeriod=604796 Oct 02 18:43:47 crc kubenswrapper[4909]: I1002 18:43:47.514364 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" containerID="cri-o://670db3e1871ba87e849dc034f28ebc741ee8f76e51481d6a9a67e91fe691d88b" gracePeriod=604796 Oct 
02 18:43:47 crc kubenswrapper[4909]: I1002 18:43:47.556572 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.123:5671: connect: connection refused" Oct 02 18:43:47 crc kubenswrapper[4909]: I1002 18:43:47.871538 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.124:5671: connect: connection refused" Oct 02 18:43:53 crc kubenswrapper[4909]: I1002 18:43:53.054795 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:43:53 crc kubenswrapper[4909]: I1002 18:43:53.055407 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:43:54 crc kubenswrapper[4909]: I1002 18:43:54.320470 4909 generic.go:334] "Generic (PLEG): container finished" podID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerID="15b88cbc58eb05c48efebf39a96fe5676c9884e5c277e9588c948540649bfcf2" exitCode=0 Oct 02 18:43:54 crc kubenswrapper[4909]: I1002 18:43:54.320533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerDied","Data":"15b88cbc58eb05c48efebf39a96fe5676c9884e5c277e9588c948540649bfcf2"} Oct 02 18:43:54 crc kubenswrapper[4909]: I1002 18:43:54.324742 4909 
generic.go:334] "Generic (PLEG): container finished" podID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerID="670db3e1871ba87e849dc034f28ebc741ee8f76e51481d6a9a67e91fe691d88b" exitCode=0 Oct 02 18:43:54 crc kubenswrapper[4909]: I1002 18:43:54.324803 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerDied","Data":"670db3e1871ba87e849dc034f28ebc741ee8f76e51481d6a9a67e91fe691d88b"} Oct 02 18:43:57 crc kubenswrapper[4909]: I1002 18:43:57.556230 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.123:5671: connect: connection refused" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.491624 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.495004 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.500829 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.555239 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " 
pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657444 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssfjh\" (UniqueName: \"kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.657623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763196 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763526 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-zrqs2\" 
(UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763884 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssfjh\" (UniqueName: \"kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.763933 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " 
pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.764186 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.765315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.765604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.765981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.766693 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 
18:44:00.767829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.787882 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssfjh\" (UniqueName: \"kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh\") pod \"dnsmasq-dns-68df85789f-zrqs2\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:00 crc kubenswrapper[4909]: I1002 18:44:00.869952 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:02 crc kubenswrapper[4909]: I1002 18:44:02.872212 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.124:5671: i/o timeout" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.183577 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.337975 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338059 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338188 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6rwb\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338268 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338352 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338377 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338394 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.338453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf\") pod \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\" (UID: \"a0396bfb-ab96-4eb9-af72-e3597ca74ca4\") " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 
18:44:04.345347 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb" (OuterVolumeSpecName: "kube-api-access-t6rwb") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "kube-api-access-t6rwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.357262 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info" (OuterVolumeSpecName: "pod-info") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.361299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.361322 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.367768 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.369228 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.407859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.421460 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.432096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data" (OuterVolumeSpecName: "config-data") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.436192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf" (OuterVolumeSpecName: "server-conf") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441527 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441557 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441569 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441578 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-server-conf\") on node \"crc\" 
DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441586 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441595 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441603 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441612 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441620 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6rwb\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-kube-api-access-t6rwb\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.441641 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.467733 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.486324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a0396bfb-ab96-4eb9-af72-e3597ca74ca4","Type":"ContainerDied","Data":"33584b1bdd64ae17404374ab0cf3a532de1f065a14d35bedd1e3afa6e69f48cf"} Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.486385 4909 scope.go:117] "RemoveContainer" containerID="15b88cbc58eb05c48efebf39a96fe5676c9884e5c277e9588c948540649bfcf2" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.486530 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.549623 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.578630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a0396bfb-ab96-4eb9-af72-e3597ca74ca4" (UID: "a0396bfb-ab96-4eb9-af72-e3597ca74ca4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.652729 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0396bfb-ab96-4eb9-af72-e3597ca74ca4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.849626 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.899796 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.911963 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:44:04 crc kubenswrapper[4909]: E1002 18:44:04.912385 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.912397 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" Oct 02 18:44:04 crc kubenswrapper[4909]: E1002 18:44:04.912428 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="setup-container" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.912434 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="setup-container" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.912641 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" containerName="rabbitmq" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.913916 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.917863 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.917887 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-f6kjx" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.918159 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.918274 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.919242 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.919848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.921121 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 18:44:04 crc kubenswrapper[4909]: I1002 18:44:04.943571 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.064746 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.064816 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.064848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc2c3682-578f-4f96-a535-d35eb31303c6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.064875 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc2c3682-578f-4f96-a535-d35eb31303c6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065053 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx6h\" (UniqueName: 
\"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-kube-api-access-5lx6h\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065502 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.065613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167410 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc2c3682-578f-4f96-a535-d35eb31303c6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 
18:44:05.167555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc2c3682-578f-4f96-a535-d35eb31303c6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx6h\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-kube-api-access-5lx6h\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.167653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.168105 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.169023 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.172143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.173265 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.175423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.175638 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.178156 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc2c3682-578f-4f96-a535-d35eb31303c6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.178835 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.179661 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc2c3682-578f-4f96-a535-d35eb31303c6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.183426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc2c3682-578f-4f96-a535-d35eb31303c6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.203457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx6h\" (UniqueName: \"kubernetes.io/projected/fc2c3682-578f-4f96-a535-d35eb31303c6-kube-api-access-5lx6h\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.222860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"fc2c3682-578f-4f96-a535-d35eb31303c6\") " pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.237244 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 18:44:05 crc kubenswrapper[4909]: E1002 18:44:05.463563 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Oct 02 18:44:05 crc kubenswrapper[4909]: E1002 18:44:05.463615 4909 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Oct 02 18:44:05 crc kubenswrapper[4909]: E1002 18:44:05.463735 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bdhbbh7bh66h568h8bhb8h5b5h99h54dh65dh94hcdh65dhf8h58h647hdfhdbh697h64ch95h5d4h697h9fh65fh5dchc4h9dh5dbh584h59bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnr5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(555cac57-752b-43f7-80f6-2d768759cad4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:44:05 crc kubenswrapper[4909]: I1002 18:44:05.624910 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0396bfb-ab96-4eb9-af72-e3597ca74ca4" path="/var/lib/kubelet/pods/a0396bfb-ab96-4eb9-af72-e3597ca74ca4/volumes" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.022675 4909 scope.go:117] "RemoveContainer" containerID="ba73d188de58df3dffa8c592fb175de9e46bf125cd7abfcd3881fb87ceadff99" Oct 02 18:44:06 crc kubenswrapper[4909]: E1002 18:44:06.134375 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Oct 02 18:44:06 crc kubenswrapper[4909]: E1002 18:44:06.134604 4909 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Oct 02 18:44:06 crc 
kubenswrapper[4909]: E1002 18:44:06.134762 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6kqf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-hfpp7_openstack(337abd66-4416-4bbb-9ee9-29e704fabd94): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 18:44:06 crc kubenswrapper[4909]: E1002 18:44:06.136003 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-hfpp7" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.207616 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296684 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296813 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296883 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqlx\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296932 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.296948 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.297018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.297071 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.297107 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.297129 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.297167 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd\") pod \"d73c266b-a3db-431b-a40f-f0a5b9d06610\" (UID: \"d73c266b-a3db-431b-a40f-f0a5b9d06610\") " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 
18:44:06.306539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.308883 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.311600 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info" (OuterVolumeSpecName: "pod-info") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.311603 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx" (OuterVolumeSpecName: "kube-api-access-mfqlx") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "kube-api-access-mfqlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.311790 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.312492 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.323858 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.331573 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.349527 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data" (OuterVolumeSpecName: "config-data") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.366963 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf" (OuterVolumeSpecName: "server-conf") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400507 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqlx\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-kube-api-access-mfqlx\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400551 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400569 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d73c266b-a3db-431b-a40f-f0a5b9d06610-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400586 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 
18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400596 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d73c266b-a3db-431b-a40f-f0a5b9d06610-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400607 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400617 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d73c266b-a3db-431b-a40f-f0a5b9d06610-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400627 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400661 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.400674 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.418877 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d73c266b-a3db-431b-a40f-f0a5b9d06610" (UID: "d73c266b-a3db-431b-a40f-f0a5b9d06610"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.431619 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.502328 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d73c266b-a3db-431b-a40f-f0a5b9d06610-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.502371 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.508652 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:06 crc kubenswrapper[4909]: W1002 18:44:06.517160 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2c3682_578f_4f96_a535_d35eb31303c6.slice/crio-9037675193cf2cf21e82e630cac8e614bbd97105c2f3f580e1061cda48b3361a WatchSource:0}: Error finding container 9037675193cf2cf21e82e630cac8e614bbd97105c2f3f580e1061cda48b3361a: Status 404 returned error can't find the container with id 9037675193cf2cf21e82e630cac8e614bbd97105c2f3f580e1061cda48b3361a Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.518936 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.520516 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.520720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d73c266b-a3db-431b-a40f-f0a5b9d06610","Type":"ContainerDied","Data":"b5af4f85b08d4df859e6830fb07a7025e820902e0936214d7f745d7593c570ae"} Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.520770 4909 scope.go:117] "RemoveContainer" containerID="670db3e1871ba87e849dc034f28ebc741ee8f76e51481d6a9a67e91fe691d88b" Oct 02 18:44:06 crc kubenswrapper[4909]: E1002 18:44:06.521345 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-hfpp7" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.563227 4909 scope.go:117] "RemoveContainer" containerID="f418b687441436b3a07b048f6faffb535085d2fc78e0e54a8616c84206f2f045" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.575387 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.601597 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.626601 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:44:06 crc kubenswrapper[4909]: E1002 18:44:06.627004 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.627033 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" Oct 02 18:44:06 
crc kubenswrapper[4909]: E1002 18:44:06.627054 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="setup-container" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.627061 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="setup-container" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.627280 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" containerName="rabbitmq" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.628321 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.629929 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.630089 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.632303 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.632451 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.634443 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.634685 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xdbxc" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.634846 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 18:44:06 crc 
kubenswrapper[4909]: I1002 18:44:06.637418 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706646 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpb7h\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-kube-api-access-xpb7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706708 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f627398-76ee-40f8-9c82-47cc58ecb013-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706868 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.706885 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f627398-76ee-40f8-9c82-47cc58ecb013-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.707158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.707192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.707240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.707267 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.707289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.810262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.810648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.810780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.810903 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811162 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpb7h\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-kube-api-access-xpb7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811662 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 
18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.811992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f627398-76ee-40f8-9c82-47cc58ecb013-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812439 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f627398-76ee-40f8-9c82-47cc58ecb013-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812240 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.812216 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f627398-76ee-40f8-9c82-47cc58ecb013-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.814106 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.817449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f627398-76ee-40f8-9c82-47cc58ecb013-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.817950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f627398-76ee-40f8-9c82-47cc58ecb013-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.818453 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.819149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.832729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpb7h\" (UniqueName: \"kubernetes.io/projected/5f627398-76ee-40f8-9c82-47cc58ecb013-kube-api-access-xpb7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.869643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f627398-76ee-40f8-9c82-47cc58ecb013\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:06 crc kubenswrapper[4909]: I1002 18:44:06.947634 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.539097 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerStarted","Data":"4fd3acd9964168038d5cd23fb6ce81e8dcc3ce147ef98d00e703f8cb9484ed5d"} Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.541522 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc2c3682-578f-4f96-a535-d35eb31303c6","Type":"ContainerStarted","Data":"9037675193cf2cf21e82e630cac8e614bbd97105c2f3f580e1061cda48b3361a"} Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.544091 4909 generic.go:334] "Generic (PLEG): container finished" podID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerID="30f4ca4f34045eb0a9ff4db2ce98ee2dda0d05dc0bb8bfe545970a0608e3c29d" exitCode=0 Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.544176 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" event={"ID":"6d09b7b4-4a61-4c72-982f-635c697c12af","Type":"ContainerDied","Data":"30f4ca4f34045eb0a9ff4db2ce98ee2dda0d05dc0bb8bfe545970a0608e3c29d"} Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.544210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" event={"ID":"6d09b7b4-4a61-4c72-982f-635c697c12af","Type":"ContainerStarted","Data":"6c57c55be2bca5dc0b2e29caf586fd4ec511902ebaffea55e3abbf3930259d4c"} Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.582504 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 18:44:07 crc kubenswrapper[4909]: W1002 18:44:07.592977 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f627398_76ee_40f8_9c82_47cc58ecb013.slice/crio-5a718069fe44e98f663307ac12f808240ca846b4504ffc53d35d43afa68498a3 WatchSource:0}: Error finding container 5a718069fe44e98f663307ac12f808240ca846b4504ffc53d35d43afa68498a3: Status 404 returned error can't find the container with id 5a718069fe44e98f663307ac12f808240ca846b4504ffc53d35d43afa68498a3 Oct 02 18:44:07 crc kubenswrapper[4909]: I1002 18:44:07.627892 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73c266b-a3db-431b-a40f-f0a5b9d06610" path="/var/lib/kubelet/pods/d73c266b-a3db-431b-a40f-f0a5b9d06610/volumes" Oct 02 18:44:08 crc kubenswrapper[4909]: I1002 18:44:08.561137 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f627398-76ee-40f8-9c82-47cc58ecb013","Type":"ContainerStarted","Data":"5a718069fe44e98f663307ac12f808240ca846b4504ffc53d35d43afa68498a3"} Oct 02 18:44:08 crc kubenswrapper[4909]: I1002 18:44:08.562404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc2c3682-578f-4f96-a535-d35eb31303c6","Type":"ContainerStarted","Data":"1873af0efaa4aa82ef215447a0bb69229c589ff3f47f78233452cecaae1cfa51"} Oct 02 18:44:08 crc kubenswrapper[4909]: I1002 18:44:08.567366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" event={"ID":"6d09b7b4-4a61-4c72-982f-635c697c12af","Type":"ContainerStarted","Data":"e27c4a453998ac72f03afb44cdd0c28c275c477c634b7471d22c459471182f69"} Oct 02 18:44:08 crc kubenswrapper[4909]: I1002 18:44:08.568305 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:08 crc kubenswrapper[4909]: I1002 18:44:08.629222 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" podStartSLOduration=8.629197376 
podStartE2EDuration="8.629197376s" podCreationTimestamp="2025-10-02 18:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:44:08.627503863 +0000 UTC m=+1569.814999722" watchObservedRunningTime="2025-10-02 18:44:08.629197376 +0000 UTC m=+1569.816693275" Oct 02 18:44:09 crc kubenswrapper[4909]: I1002 18:44:09.594430 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerStarted","Data":"5e807d69683a3f44c396751eda53f710e39c98cf4d75d32c8a2df1a4f263ca8f"} Oct 02 18:44:10 crc kubenswrapper[4909]: I1002 18:44:10.608064 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f627398-76ee-40f8-9c82-47cc58ecb013","Type":"ContainerStarted","Data":"66a099c0c928f7372b75958c01120cbacedab79a54fe5e831269c4c4bebfe2ae"} Oct 02 18:44:12 crc kubenswrapper[4909]: E1002 18:44:12.927926 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" Oct 02 18:44:13 crc kubenswrapper[4909]: I1002 18:44:13.661692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerStarted","Data":"7ae9166ea8d6783d5c427192faca28616a0e7c117c04e58f5a622c6432d83ccd"} Oct 02 18:44:13 crc kubenswrapper[4909]: I1002 18:44:13.661925 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 18:44:13 crc kubenswrapper[4909]: E1002 18:44:13.680543 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" Oct 02 18:44:14 crc kubenswrapper[4909]: E1002 18:44:14.686546 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" Oct 02 18:44:15 crc kubenswrapper[4909]: I1002 18:44:15.873292 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:15 crc kubenswrapper[4909]: I1002 18:44:15.985816 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"] Oct 02 18:44:15 crc kubenswrapper[4909]: I1002 18:44:15.989611 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="dnsmasq-dns" containerID="cri-o://ebabeb7a4d04573ba4147fecfd9dc7c2d1652e467e4709ffa1f03b1e63a14a20" gracePeriod=10 Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.128822 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.130724 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.174861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.174938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.174976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.175001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.175066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0\") pod 
\"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.175086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vp96\" (UniqueName: \"kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.175114 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.175975 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277233 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vp96\" (UniqueName: \"kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277258 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.277432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278364 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278608 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278646 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.278746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.302076 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vp96\" (UniqueName: \"kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96\") pod \"dnsmasq-dns-768b698657-jrj72\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.474613 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.714904 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerID="ebabeb7a4d04573ba4147fecfd9dc7c2d1652e467e4709ffa1f03b1e63a14a20" exitCode=0 Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.715088 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" event={"ID":"cf07891f-01ab-41de-8d62-a85ff3e20102","Type":"ContainerDied","Data":"ebabeb7a4d04573ba4147fecfd9dc7c2d1652e467e4709ffa1f03b1e63a14a20"} Oct 02 18:44:16 crc kubenswrapper[4909]: I1002 18:44:16.985185 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.235680 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txcln\" (UniqueName: \"kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302338 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302438 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302488 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.302532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config\") pod \"cf07891f-01ab-41de-8d62-a85ff3e20102\" (UID: \"cf07891f-01ab-41de-8d62-a85ff3e20102\") " Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.340016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln" (OuterVolumeSpecName: "kube-api-access-txcln") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "kube-api-access-txcln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.366107 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config" (OuterVolumeSpecName: "config") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.387918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.387714 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.395816 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.400935 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf07891f-01ab-41de-8d62-a85ff3e20102" (UID: "cf07891f-01ab-41de-8d62-a85ff3e20102"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405251 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405276 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405288 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405310 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-config\") on node \"crc\" DevicePath \"\"" Oct 02 
18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405320 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txcln\" (UniqueName: \"kubernetes.io/projected/cf07891f-01ab-41de-8d62-a85ff3e20102-kube-api-access-txcln\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.405329 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf07891f-01ab-41de-8d62-a85ff3e20102-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.740297 4909 generic.go:334] "Generic (PLEG): container finished" podID="21f44fe5-d38a-40f0-9649-de47086a080a" containerID="2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8" exitCode=0 Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.740387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-jrj72" event={"ID":"21f44fe5-d38a-40f0-9649-de47086a080a","Type":"ContainerDied","Data":"2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8"} Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.740420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-jrj72" event={"ID":"21f44fe5-d38a-40f0-9649-de47086a080a","Type":"ContainerStarted","Data":"fd60f184f5b890203faab9277fc169103915a0925afc9c8b19f9911f8fc1c2ed"} Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.744332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" event={"ID":"cf07891f-01ab-41de-8d62-a85ff3e20102","Type":"ContainerDied","Data":"2c40dba25d7518b660adbc1776db946ed3f0e20eb4d044dbf78b86fdec7c4c8e"} Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.744401 4909 scope.go:117] "RemoveContainer" containerID="ebabeb7a4d04573ba4147fecfd9dc7c2d1652e467e4709ffa1f03b1e63a14a20" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.744639 4909 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-6lc6j" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.779126 4909 scope.go:117] "RemoveContainer" containerID="91abda2f96b6133b3adc760820d74c635534ddb1a1b872bef5f3aafbe9be557b" Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.796313 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"] Oct 02 18:44:17 crc kubenswrapper[4909]: I1002 18:44:17.811772 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-6lc6j"] Oct 02 18:44:18 crc kubenswrapper[4909]: I1002 18:44:18.760468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-jrj72" event={"ID":"21f44fe5-d38a-40f0-9649-de47086a080a","Type":"ContainerStarted","Data":"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2"} Oct 02 18:44:18 crc kubenswrapper[4909]: I1002 18:44:18.761183 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:18 crc kubenswrapper[4909]: I1002 18:44:18.797239 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768b698657-jrj72" podStartSLOduration=2.797219169 podStartE2EDuration="2.797219169s" podCreationTimestamp="2025-10-02 18:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:44:18.785768358 +0000 UTC m=+1579.973264227" watchObservedRunningTime="2025-10-02 18:44:18.797219169 +0000 UTC m=+1579.984715038" Oct 02 18:44:19 crc kubenswrapper[4909]: I1002 18:44:19.027825 4909 scope.go:117] "RemoveContainer" containerID="d959cdb36cc3e4d4544fd4521ab4234f2a915f69ebcba54c4499da5c490b6432" Oct 02 18:44:19 crc kubenswrapper[4909]: I1002 18:44:19.065428 4909 scope.go:117] "RemoveContainer" 
containerID="716e80106b4a3c6ae0c559b68492ff4badbab0591931bc77196fc713177cf1e0" Oct 02 18:44:19 crc kubenswrapper[4909]: I1002 18:44:19.632203 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" path="/var/lib/kubelet/pods/cf07891f-01ab-41de-8d62-a85ff3e20102/volumes" Oct 02 18:44:22 crc kubenswrapper[4909]: I1002 18:44:22.832268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hfpp7" event={"ID":"337abd66-4416-4bbb-9ee9-29e704fabd94","Type":"ContainerStarted","Data":"06cc032cbcea5c5a595809e2f3dd4fc182f75930682766e85adc4866a41cafab"} Oct 02 18:44:22 crc kubenswrapper[4909]: I1002 18:44:22.857487 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-hfpp7" podStartSLOduration=2.016196773 podStartE2EDuration="42.857469493s" podCreationTimestamp="2025-10-02 18:43:40 +0000 UTC" firstStartedPulling="2025-10-02 18:43:40.983511079 +0000 UTC m=+1542.171006938" lastFinishedPulling="2025-10-02 18:44:21.824783799 +0000 UTC m=+1583.012279658" observedRunningTime="2025-10-02 18:44:22.852474435 +0000 UTC m=+1584.039970294" watchObservedRunningTime="2025-10-02 18:44:22.857469493 +0000 UTC m=+1584.044965352" Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.054996 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.055358 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.055490 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.056468 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.056655 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" gracePeriod=600 Oct 02 18:44:23 crc kubenswrapper[4909]: E1002 18:44:23.181979 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.849666 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" exitCode=0 Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.849697 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142"} Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.849851 4909 scope.go:117] "RemoveContainer" containerID="df159098c9d3023b7a6d6812151d5a2e4c4eebbf2424f0b96db2104b354ed569" Oct 02 18:44:23 crc kubenswrapper[4909]: I1002 18:44:23.851619 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:44:23 crc kubenswrapper[4909]: E1002 18:44:23.852659 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:44:24 crc kubenswrapper[4909]: I1002 18:44:24.869721 4909 generic.go:334] "Generic (PLEG): container finished" podID="337abd66-4416-4bbb-9ee9-29e704fabd94" containerID="06cc032cbcea5c5a595809e2f3dd4fc182f75930682766e85adc4866a41cafab" exitCode=0 Oct 02 18:44:24 crc kubenswrapper[4909]: I1002 18:44:24.869773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hfpp7" event={"ID":"337abd66-4416-4bbb-9ee9-29e704fabd94","Type":"ContainerDied","Data":"06cc032cbcea5c5a595809e2f3dd4fc182f75930682766e85adc4866a41cafab"} Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.443006 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-hfpp7" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.476869 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.513896 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle\") pod \"337abd66-4416-4bbb-9ee9-29e704fabd94\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.514227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data\") pod \"337abd66-4416-4bbb-9ee9-29e704fabd94\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.514482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6kqf\" (UniqueName: \"kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf\") pod \"337abd66-4416-4bbb-9ee9-29e704fabd94\" (UID: \"337abd66-4416-4bbb-9ee9-29e704fabd94\") " Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.533338 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf" (OuterVolumeSpecName: "kube-api-access-t6kqf") pod "337abd66-4416-4bbb-9ee9-29e704fabd94" (UID: "337abd66-4416-4bbb-9ee9-29e704fabd94"). InnerVolumeSpecName "kube-api-access-t6kqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.588971 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.589265 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="dnsmasq-dns" containerID="cri-o://e27c4a453998ac72f03afb44cdd0c28c275c477c634b7471d22c459471182f69" gracePeriod=10 Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.618612 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6kqf\" (UniqueName: \"kubernetes.io/projected/337abd66-4416-4bbb-9ee9-29e704fabd94-kube-api-access-t6kqf\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.644524 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "337abd66-4416-4bbb-9ee9-29e704fabd94" (UID: "337abd66-4416-4bbb-9ee9-29e704fabd94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.676190 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data" (OuterVolumeSpecName: "config-data") pod "337abd66-4416-4bbb-9ee9-29e704fabd94" (UID: "337abd66-4416-4bbb-9ee9-29e704fabd94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.720149 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.720179 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337abd66-4416-4bbb-9ee9-29e704fabd94-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.898617 4909 generic.go:334] "Generic (PLEG): container finished" podID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerID="e27c4a453998ac72f03afb44cdd0c28c275c477c634b7471d22c459471182f69" exitCode=0 Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.898765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" event={"ID":"6d09b7b4-4a61-4c72-982f-635c697c12af","Type":"ContainerDied","Data":"e27c4a453998ac72f03afb44cdd0c28c275c477c634b7471d22c459471182f69"} Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.901568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hfpp7" event={"ID":"337abd66-4416-4bbb-9ee9-29e704fabd94","Type":"ContainerDied","Data":"3aadb5bce0148adb03934279d0f6e326bf9f38b7028da9128cc94f204f1899a9"} Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.901621 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aadb5bce0148adb03934279d0f6e326bf9f38b7028da9128cc94f204f1899a9" Oct 02 18:44:26 crc kubenswrapper[4909]: I1002 18:44:26.901645 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hfpp7" Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.788479 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.911797 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" event={"ID":"6d09b7b4-4a61-4c72-982f-635c697c12af","Type":"ContainerDied","Data":"6c57c55be2bca5dc0b2e29caf586fd4ec511902ebaffea55e3abbf3930259d4c"} Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.911843 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-zrqs2" Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.911852 4909 scope.go:117] "RemoveContainer" containerID="e27c4a453998ac72f03afb44cdd0c28c275c477c634b7471d22c459471182f69" Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.939946 4909 scope.go:117] "RemoveContainer" containerID="30f4ca4f34045eb0a9ff4db2ce98ee2dda0d05dc0bb8bfe545970a0608e3c29d" Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949445 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssfjh\" (UniqueName: \"kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949591 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 
18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949719 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.949824 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam\") pod \"6d09b7b4-4a61-4c72-982f-635c697c12af\" (UID: \"6d09b7b4-4a61-4c72-982f-635c697c12af\") " Oct 02 18:44:27 crc kubenswrapper[4909]: I1002 18:44:27.964346 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh" (OuterVolumeSpecName: "kube-api-access-ssfjh") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "kube-api-access-ssfjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.020978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.027800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config" (OuterVolumeSpecName: "config") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.030553 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.031653 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.042277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.044053 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d09b7b4-4a61-4c72-982f-635c697c12af" (UID: "6d09b7b4-4a61-4c72-982f-635c697c12af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052172 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052193 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssfjh\" (UniqueName: \"kubernetes.io/projected/6d09b7b4-4a61-4c72-982f-635c697c12af-kube-api-access-ssfjh\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052205 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052213 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052222 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052230 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-config\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.052238 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d09b7b4-4a61-4c72-982f-635c697c12af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.255782 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.265322 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-zrqs2"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.523562 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6fd97cfc64-gbtcw"] Oct 02 18:44:28 crc kubenswrapper[4909]: E1002 18:44:28.524098 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="init" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524115 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="init" Oct 02 18:44:28 crc kubenswrapper[4909]: E1002 18:44:28.524135 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="dnsmasq-dns" Oct 02 18:44:28 crc 
kubenswrapper[4909]: I1002 18:44:28.524144 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="dnsmasq-dns" Oct 02 18:44:28 crc kubenswrapper[4909]: E1002 18:44:28.524162 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" containerName="heat-db-sync" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524171 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" containerName="heat-db-sync" Oct 02 18:44:28 crc kubenswrapper[4909]: E1002 18:44:28.524196 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="init" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524206 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="init" Oct 02 18:44:28 crc kubenswrapper[4909]: E1002 18:44:28.524230 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="dnsmasq-dns" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524238 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="dnsmasq-dns" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524544 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" containerName="heat-db-sync" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524609 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf07891f-01ab-41de-8d62-a85ff3e20102" containerName="dnsmasq-dns" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.524638 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" containerName="dnsmasq-dns" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.525673 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.540371 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6fd97cfc64-gbtcw"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.561250 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-85bddbb496-tlzmz"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.562755 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.633927 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6cc7795987-nsplf"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.637210 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.648727 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-85bddbb496-tlzmz"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.663916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsx4\" (UniqueName: \"kubernetes.io/projected/b0981006-350f-4a53-85d4-35a0bb5c3eca-kube-api-access-kbsx4\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.663997 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 
18:44:28.664072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data-custom\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data-custom\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-internal-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-combined-ca-bundle\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc 
kubenswrapper[4909]: I1002 18:44:28.664274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-combined-ca-bundle\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664317 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgn4g\" (UniqueName: \"kubernetes.io/projected/8b922021-6783-4a96-8ee1-a571074d1f49-kube-api-access-fgn4g\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.664396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-public-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.669813 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cc7795987-nsplf"] Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data-custom\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766381 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-internal-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-combined-ca-bundle\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766509 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-combined-ca-bundle\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766546 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgn4g\" (UniqueName: 
\"kubernetes.io/projected/8b922021-6783-4a96-8ee1-a571074d1f49-kube-api-access-fgn4g\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-internal-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766609 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-public-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26842\" (UniqueName: \"kubernetes.io/projected/9220d7fd-14fd-44e5-ba47-2b4038b7472f-kube-api-access-26842\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766658 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsx4\" (UniqueName: \"kubernetes.io/projected/b0981006-350f-4a53-85d4-35a0bb5c3eca-kube-api-access-kbsx4\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766755 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-public-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data-custom\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766810 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-combined-ca-bundle\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.766828 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data-custom\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.770059 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data-custom\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.772480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data-custom\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.772658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-public-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.772827 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-combined-ca-bundle\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.775804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-config-data\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.776176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-internal-tls-certs\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.775056 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b922021-6783-4a96-8ee1-a571074d1f49-combined-ca-bundle\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.788752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0981006-350f-4a53-85d4-35a0bb5c3eca-config-data\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.791494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsx4\" (UniqueName: \"kubernetes.io/projected/b0981006-350f-4a53-85d4-35a0bb5c3eca-kube-api-access-kbsx4\") pod \"heat-engine-6fd97cfc64-gbtcw\" (UID: \"b0981006-350f-4a53-85d4-35a0bb5c3eca\") " pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.791948 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgn4g\" (UniqueName: \"kubernetes.io/projected/8b922021-6783-4a96-8ee1-a571074d1f49-kube-api-access-fgn4g\") pod \"heat-api-85bddbb496-tlzmz\" (UID: \"8b922021-6783-4a96-8ee1-a571074d1f49\") " pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.868322 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data-custom\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.868597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-combined-ca-bundle\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.869522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.869676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-internal-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.869778 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26842\" (UniqueName: \"kubernetes.io/projected/9220d7fd-14fd-44e5-ba47-2b4038b7472f-kube-api-access-26842\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.869969 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-public-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.872763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-combined-ca-bundle\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.873480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-internal-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.873671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-public-tls-certs\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.872777 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data-custom\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.875705 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9220d7fd-14fd-44e5-ba47-2b4038b7472f-config-data\") pod 
\"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.891535 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26842\" (UniqueName: \"kubernetes.io/projected/9220d7fd-14fd-44e5-ba47-2b4038b7472f-kube-api-access-26842\") pod \"heat-cfnapi-6cc7795987-nsplf\" (UID: \"9220d7fd-14fd-44e5-ba47-2b4038b7472f\") " pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.893098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.933684 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:28 crc kubenswrapper[4909]: I1002 18:44:28.957138 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:29 crc kubenswrapper[4909]: W1002 18:44:29.426215 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0981006_350f_4a53_85d4_35a0bb5c3eca.slice/crio-07bf84cb068337c3968214b10d56422ede67f233204f64211647393b3c77f432 WatchSource:0}: Error finding container 07bf84cb068337c3968214b10d56422ede67f233204f64211647393b3c77f432: Status 404 returned error can't find the container with id 07bf84cb068337c3968214b10d56422ede67f233204f64211647393b3c77f432 Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.430483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6fd97cfc64-gbtcw"] Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.525152 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-85bddbb496-tlzmz"] Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.533201 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cc7795987-nsplf"] Oct 02 18:44:29 crc kubenswrapper[4909]: W1002 18:44:29.543229 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b922021_6783_4a96_8ee1_a571074d1f49.slice/crio-11efa7a0015a7c9f61a8d4ecfa2feee3af8d85c13f05933ca925e14f5856d311 WatchSource:0}: Error finding container 11efa7a0015a7c9f61a8d4ecfa2feee3af8d85c13f05933ca925e14f5856d311: Status 404 returned error can't find the container with id 11efa7a0015a7c9f61a8d4ecfa2feee3af8d85c13f05933ca925e14f5856d311 Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.628174 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d09b7b4-4a61-4c72-982f-635c697c12af" path="/var/lib/kubelet/pods/6d09b7b4-4a61-4c72-982f-635c697c12af/volumes" Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.645800 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.945604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc7795987-nsplf" event={"ID":"9220d7fd-14fd-44e5-ba47-2b4038b7472f","Type":"ContainerStarted","Data":"8580a271c37e5aae80da7ab01363bf7116e4f46bf6136eba251ef51d2150108a"} Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.948662 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6fd97cfc64-gbtcw" event={"ID":"b0981006-350f-4a53-85d4-35a0bb5c3eca","Type":"ContainerStarted","Data":"18b94d51dc5d7887d7d4e9672fa43080c0b2087d2a458880d2db06bb079c4a6e"} Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.948684 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6fd97cfc64-gbtcw" event={"ID":"b0981006-350f-4a53-85d4-35a0bb5c3eca","Type":"ContainerStarted","Data":"07bf84cb068337c3968214b10d56422ede67f233204f64211647393b3c77f432"} Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.949810 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.953862 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85bddbb496-tlzmz" event={"ID":"8b922021-6783-4a96-8ee1-a571074d1f49","Type":"ContainerStarted","Data":"11efa7a0015a7c9f61a8d4ecfa2feee3af8d85c13f05933ca925e14f5856d311"} Oct 02 18:44:29 crc kubenswrapper[4909]: I1002 18:44:29.969039 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6fd97cfc64-gbtcw" podStartSLOduration=1.9690106950000001 podStartE2EDuration="1.969010695s" podCreationTimestamp="2025-10-02 18:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:44:29.962428227 +0000 UTC m=+1591.149924086" 
watchObservedRunningTime="2025-10-02 18:44:29.969010695 +0000 UTC m=+1591.156506554" Oct 02 18:44:30 crc kubenswrapper[4909]: I1002 18:44:30.971251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerStarted","Data":"2e719bf839e5244b72c7c63a7c45850173203ef6e7cc30d052704a825fd95864"} Oct 02 18:44:31 crc kubenswrapper[4909]: I1002 18:44:31.980843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85bddbb496-tlzmz" event={"ID":"8b922021-6783-4a96-8ee1-a571074d1f49","Type":"ContainerStarted","Data":"da6a37700b68746545845f30fccdcafbfe8b352216afc1a23c7f7b2488579995"} Oct 02 18:44:31 crc kubenswrapper[4909]: I1002 18:44:31.981405 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:31 crc kubenswrapper[4909]: I1002 18:44:31.982471 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cc7795987-nsplf" event={"ID":"9220d7fd-14fd-44e5-ba47-2b4038b7472f","Type":"ContainerStarted","Data":"b1c00ec57dfe5782a0c5c6b620270249a54c4dad513a6eb5a53724bddc79cb70"} Oct 02 18:44:32 crc kubenswrapper[4909]: I1002 18:44:32.004600 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.523265376 podStartE2EDuration="48.004581843s" podCreationTimestamp="2025-10-02 18:43:44 +0000 UTC" firstStartedPulling="2025-10-02 18:43:45.386889912 +0000 UTC m=+1546.574385771" lastFinishedPulling="2025-10-02 18:44:29.868206379 +0000 UTC m=+1591.055702238" observedRunningTime="2025-10-02 18:44:30.994472901 +0000 UTC m=+1592.181968780" watchObservedRunningTime="2025-10-02 18:44:32.004581843 +0000 UTC m=+1593.192077702" Oct 02 18:44:32 crc kubenswrapper[4909]: I1002 18:44:32.008146 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-85bddbb496-tlzmz" 
podStartSLOduration=2.452978662 podStartE2EDuration="4.008140115s" podCreationTimestamp="2025-10-02 18:44:28 +0000 UTC" firstStartedPulling="2025-10-02 18:44:29.545336787 +0000 UTC m=+1590.732832646" lastFinishedPulling="2025-10-02 18:44:31.10049823 +0000 UTC m=+1592.287994099" observedRunningTime="2025-10-02 18:44:32.00037313 +0000 UTC m=+1593.187868999" watchObservedRunningTime="2025-10-02 18:44:32.008140115 +0000 UTC m=+1593.195635974" Oct 02 18:44:32 crc kubenswrapper[4909]: I1002 18:44:32.017603 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6cc7795987-nsplf" podStartSLOduration=2.47234648 podStartE2EDuration="4.017579732s" podCreationTimestamp="2025-10-02 18:44:28 +0000 UTC" firstStartedPulling="2025-10-02 18:44:29.550708216 +0000 UTC m=+1590.738204075" lastFinishedPulling="2025-10-02 18:44:31.095941468 +0000 UTC m=+1592.283437327" observedRunningTime="2025-10-02 18:44:32.01463979 +0000 UTC m=+1593.202135649" watchObservedRunningTime="2025-10-02 18:44:32.017579732 +0000 UTC m=+1593.205075591" Oct 02 18:44:32 crc kubenswrapper[4909]: I1002 18:44:32.991292 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:36 crc kubenswrapper[4909]: I1002 18:44:36.611896 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:44:36 crc kubenswrapper[4909]: E1002 18:44:36.613372 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.334414 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl"] Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.335975 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.342284 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.343431 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.343555 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.343681 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.370361 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl"] Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.494884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlgp\" (UniqueName: \"kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.495072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.495149 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.495365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.597303 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.597385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 
18:44:38.597440 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.597542 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlgp\" (UniqueName: \"kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.604982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.606759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.610719 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.627924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlgp\" (UniqueName: \"kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:38 crc kubenswrapper[4909]: I1002 18:44:38.669639 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:44:39 crc kubenswrapper[4909]: I1002 18:44:39.268622 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl"] Oct 02 18:44:40 crc kubenswrapper[4909]: I1002 18:44:40.089149 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" event={"ID":"267cf7a5-8352-42ca-be8a-588a26e74159","Type":"ContainerStarted","Data":"ad7b4c8df87c45f9dbbedeffecba14712853c04dea6c9479a87c1ec44b79b455"} Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.530581 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-85bddbb496-tlzmz" Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.552048 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6cc7795987-nsplf" Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.640661 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.641828 4909 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/heat-api-5d74c74f69-9pjww" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" containerID="cri-o://b01d645485bba5753eb015244c7bb356a5cfcecb4ac8d8193374c4b5fdd50109" gracePeriod=60 Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.699276 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:44:41 crc kubenswrapper[4909]: I1002 18:44:41.699745 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" containerID="cri-o://53c16e2793e47e764f80056293ee032fca975d5f5405993af1861bfd23a216f0" gracePeriod=60 Oct 02 18:44:42 crc kubenswrapper[4909]: I1002 18:44:42.125533 4909 generic.go:334] "Generic (PLEG): container finished" podID="fc2c3682-578f-4f96-a535-d35eb31303c6" containerID="1873af0efaa4aa82ef215447a0bb69229c589ff3f47f78233452cecaae1cfa51" exitCode=0 Oct 02 18:44:42 crc kubenswrapper[4909]: I1002 18:44:42.125577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc2c3682-578f-4f96-a535-d35eb31303c6","Type":"ContainerDied","Data":"1873af0efaa4aa82ef215447a0bb69229c589ff3f47f78233452cecaae1cfa51"} Oct 02 18:44:43 crc kubenswrapper[4909]: I1002 18:44:43.145369 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc2c3682-578f-4f96-a535-d35eb31303c6","Type":"ContainerStarted","Data":"b61e5b5876fb980f3a561487fded485f13ff2fc28d3f9ad1e6ba13941e758ce4"} Oct 02 18:44:43 crc kubenswrapper[4909]: I1002 18:44:43.146067 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 18:44:43 crc kubenswrapper[4909]: I1002 18:44:43.188547 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.188517941 
podStartE2EDuration="39.188517941s" podCreationTimestamp="2025-10-02 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:44:43.18120084 +0000 UTC m=+1604.368696709" watchObservedRunningTime="2025-10-02 18:44:43.188517941 +0000 UTC m=+1604.376013800" Oct 02 18:44:44 crc kubenswrapper[4909]: I1002 18:44:44.161487 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f627398-76ee-40f8-9c82-47cc58ecb013" containerID="66a099c0c928f7372b75958c01120cbacedab79a54fe5e831269c4c4bebfe2ae" exitCode=0 Oct 02 18:44:44 crc kubenswrapper[4909]: I1002 18:44:44.161570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f627398-76ee-40f8-9c82-47cc58ecb013","Type":"ContainerDied","Data":"66a099c0c928f7372b75958c01120cbacedab79a54fe5e831269c4c4bebfe2ae"} Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.126196 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5d74c74f69-9pjww" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.205:8004/healthcheck\": read tcp 10.217.0.2:38448->10.217.0.205:8004: read: connection reset by peer" Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.145994 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.206:8000/healthcheck\": read tcp 10.217.0.2:60562->10.217.0.206:8000: read: connection reset by peer" Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.173972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"5f627398-76ee-40f8-9c82-47cc58ecb013","Type":"ContainerStarted","Data":"2755da41b6b963bb8c284b2d028cef279bc167580b7d48ea3644a134533f4980"} Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.175191 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.176609 4909 generic.go:334] "Generic (PLEG): container finished" podID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerID="b01d645485bba5753eb015244c7bb356a5cfcecb4ac8d8193374c4b5fdd50109" exitCode=0 Oct 02 18:44:45 crc kubenswrapper[4909]: I1002 18:44:45.176633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5d74c74f69-9pjww" event={"ID":"d90f5cd7-6d13-41b6-8c6f-86121b523321","Type":"ContainerDied","Data":"b01d645485bba5753eb015244c7bb356a5cfcecb4ac8d8193374c4b5fdd50109"} Oct 02 18:44:46 crc kubenswrapper[4909]: I1002 18:44:46.188414 4909 generic.go:334] "Generic (PLEG): container finished" podID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerID="53c16e2793e47e764f80056293ee032fca975d5f5405993af1861bfd23a216f0" exitCode=0 Oct 02 18:44:46 crc kubenswrapper[4909]: I1002 18:44:46.188497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" event={"ID":"e56e6356-0b90-4570-8068-d341cf2c7b50","Type":"ContainerDied","Data":"53c16e2793e47e764f80056293ee032fca975d5f5405993af1861bfd23a216f0"} Oct 02 18:44:48 crc kubenswrapper[4909]: I1002 18:44:48.940643 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6fd97cfc64-gbtcw" Oct 02 18:44:48 crc kubenswrapper[4909]: I1002 18:44:48.967449 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.967423869 podStartE2EDuration="42.967423869s" podCreationTimestamp="2025-10-02 18:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:44:45.196642125 +0000 UTC m=+1606.384137984" watchObservedRunningTime="2025-10-02 18:44:48.967423869 +0000 UTC m=+1610.154919758" Oct 02 18:44:48 crc kubenswrapper[4909]: I1002 18:44:48.994300 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"] Oct 02 18:44:48 crc kubenswrapper[4909]: I1002 18:44:48.994543 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7b47684999-l2rd2" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" containerID="cri-o://66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" gracePeriod=60 Oct 02 18:44:50 crc kubenswrapper[4909]: I1002 18:44:50.608061 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:44:50 crc kubenswrapper[4909]: E1002 18:44:50.608711 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:44:52 crc kubenswrapper[4909]: E1002 18:44:52.020873 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:44:52 crc kubenswrapper[4909]: E1002 18:44:52.022794 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:44:52 crc kubenswrapper[4909]: E1002 18:44:52.023838 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:44:52 crc kubenswrapper[4909]: E1002 18:44:52.023924 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7b47684999-l2rd2" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.646488 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.655126 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.844831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.844947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.844978 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swrz4\" (UniqueName: \"kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845028 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845287 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845386 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w45m2\" (UniqueName: \"kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2\") pod \"d90f5cd7-6d13-41b6-8c6f-86121b523321\" (UID: \"d90f5cd7-6d13-41b6-8c6f-86121b523321\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " 
Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.845492 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs\") pod \"e56e6356-0b90-4570-8068-d341cf2c7b50\" (UID: \"e56e6356-0b90-4570-8068-d341cf2c7b50\") " Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.852159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2" (OuterVolumeSpecName: "kube-api-access-w45m2") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "kube-api-access-w45m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.863813 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.866423 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.867221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4" (OuterVolumeSpecName: "kube-api-access-swrz4") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "kube-api-access-swrz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.889592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.895415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.925110 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data" (OuterVolumeSpecName: "config-data") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.931104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data" (OuterVolumeSpecName: "config-data") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.940042 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948007 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w45m2\" (UniqueName: \"kubernetes.io/projected/d90f5cd7-6d13-41b6-8c6f-86121b523321-kube-api-access-w45m2\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948043 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948053 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948061 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swrz4\" (UniqueName: \"kubernetes.io/projected/e56e6356-0b90-4570-8068-d341cf2c7b50-kube-api-access-swrz4\") on node 
\"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948069 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948079 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948088 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948096 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948104 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.948165 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d90f5cd7-6d13-41b6-8c6f-86121b523321" (UID: "d90f5cd7-6d13-41b6-8c6f-86121b523321"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.965822 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:52 crc kubenswrapper[4909]: I1002 18:44:52.983943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e56e6356-0b90-4570-8068-d341cf2c7b50" (UID: "e56e6356-0b90-4570-8068-d341cf2c7b50"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.049796 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.049837 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e56e6356-0b90-4570-8068-d341cf2c7b50-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.049849 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90f5cd7-6d13-41b6-8c6f-86121b523321-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.275006 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5d74c74f69-9pjww" 
event={"ID":"d90f5cd7-6d13-41b6-8c6f-86121b523321","Type":"ContainerDied","Data":"11783afa0682a494e1319772cadd242637802d6194ad694d2550f71e6ef16f0d"} Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.275066 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5d74c74f69-9pjww" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.275383 4909 scope.go:117] "RemoveContainer" containerID="b01d645485bba5753eb015244c7bb356a5cfcecb4ac8d8193374c4b5fdd50109" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.278365 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.278375 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" event={"ID":"e56e6356-0b90-4570-8068-d341cf2c7b50","Type":"ContainerDied","Data":"4de51374614b9836c11ce7f131f4827e9cc9b5b703738f4dab4db493f35eb6c9"} Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.280689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" event={"ID":"267cf7a5-8352-42ca-be8a-588a26e74159","Type":"ContainerStarted","Data":"86c7b37c85444e7e05ddbb33e8352baf763989e976e2246f51300b7f0ef77ee3"} Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.328541 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" podStartSLOduration=1.623840579 podStartE2EDuration="15.328518851s" podCreationTimestamp="2025-10-02 18:44:38 +0000 UTC" firstStartedPulling="2025-10-02 18:44:39.293206103 +0000 UTC m=+1600.480701972" lastFinishedPulling="2025-10-02 18:44:52.997884375 +0000 UTC m=+1614.185380244" observedRunningTime="2025-10-02 18:44:53.312454925 +0000 UTC m=+1614.499950794" watchObservedRunningTime="2025-10-02 18:44:53.328518851 +0000 UTC 
m=+1614.516014710" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.343978 4909 scope.go:117] "RemoveContainer" containerID="53c16e2793e47e764f80056293ee032fca975d5f5405993af1861bfd23a216f0" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.359462 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.376422 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5d74c74f69-9pjww"] Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.391102 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.402009 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5446d8bcf6-rnrrj"] Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.619617 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" path="/var/lib/kubelet/pods/d90f5cd7-6d13-41b6-8c6f-86121b523321/volumes" Oct 02 18:44:53 crc kubenswrapper[4909]: I1002 18:44:53.620718 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" path="/var/lib/kubelet/pods/e56e6356-0b90-4570-8068-d341cf2c7b50/volumes" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.240219 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.540256 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-425k4"] Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.560755 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-425k4"] Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.621863 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="63441e27-e4a8-4db3-9c60-5cd65f0e136a" path="/var/lib/kubelet/pods/63441e27-e4a8-4db3-9c60-5cd65f0e136a/volumes" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.622480 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-h9bx9"] Oct 02 18:44:55 crc kubenswrapper[4909]: E1002 18:44:55.622812 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.622825 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" Oct 02 18:44:55 crc kubenswrapper[4909]: E1002 18:44:55.622859 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.622866 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.623064 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.623091 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.623836 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.721344 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-h9bx9"] Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.805765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjnsc\" (UniqueName: \"kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.805857 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.806076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.806235 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.908683 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjnsc\" (UniqueName: 
\"kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.908776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.908835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.908866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.915678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.916412 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 
02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.925716 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjnsc\" (UniqueName: \"kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.932823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data\") pod \"aodh-db-sync-h9bx9\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:55 crc kubenswrapper[4909]: I1002 18:44:55.955145 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:44:56 crc kubenswrapper[4909]: I1002 18:44:56.952239 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 18:44:58 crc kubenswrapper[4909]: I1002 18:44:58.656926 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5d74c74f69-9pjww" podUID="d90f5cd7-6d13-41b6-8c6f-86121b523321" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.205:8004/healthcheck\": context deadline exceeded" Oct 02 18:44:58 crc kubenswrapper[4909]: I1002 18:44:58.689869 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5446d8bcf6-rnrrj" podUID="e56e6356-0b90-4570-8068-d341cf2c7b50" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.206:8000/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 18:44:58 crc kubenswrapper[4909]: I1002 18:44:58.782263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-h9bx9"] Oct 02 
18:44:59 crc kubenswrapper[4909]: I1002 18:44:59.381609 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h9bx9" event={"ID":"52cdb0d6-c8c2-4646-8350-f63892d098f5","Type":"ContainerStarted","Data":"97a8028ea3e25fd04b84c14bf3b33aa3e31e7daac9a10f908cea48d1caf58cde"} Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.139821 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms"] Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.141720 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.143728 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.148503 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.159258 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms"] Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.200816 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.200924 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p9fn\" (UniqueName: 
\"kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.200992 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.303135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.303251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.303369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p9fn\" (UniqueName: \"kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc 
kubenswrapper[4909]: I1002 18:45:00.303986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.321966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.334118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p9fn\" (UniqueName: \"kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn\") pod \"collect-profiles-29323845-f57ms\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:00 crc kubenswrapper[4909]: I1002 18:45:00.513203 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:01 crc kubenswrapper[4909]: I1002 18:45:01.002760 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms"] Oct 02 18:45:01 crc kubenswrapper[4909]: I1002 18:45:01.609737 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:01 crc kubenswrapper[4909]: E1002 18:45:01.610062 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:45:02 crc kubenswrapper[4909]: E1002 18:45:02.022368 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:45:02 crc kubenswrapper[4909]: E1002 18:45:02.024318 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:45:02 crc kubenswrapper[4909]: E1002 18:45:02.025552 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 02 18:45:02 crc kubenswrapper[4909]: E1002 18:45:02.025589 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7b47684999-l2rd2" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" Oct 02 18:45:03 crc kubenswrapper[4909]: W1002 18:45:03.429218 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914fac7d_d3a1_4cb1_b2d7_8e8821f08e15.slice/crio-dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b WatchSource:0}: Error finding container dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b: Status 404 returned error can't find the container with id dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.455371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" event={"ID":"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15","Type":"ContainerStarted","Data":"04dcec55e75c6baf2c55053dd968c5fc5960fec99a4a5909136dc166788b6cc1"} Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.456003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" event={"ID":"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15","Type":"ContainerStarted","Data":"dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b"} Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.464194 4909 generic.go:334] "Generic (PLEG): container finished" podID="c60eba96-3123-47ee-a374-50889549cc50" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" 
exitCode=0 Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.464281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b47684999-l2rd2" event={"ID":"c60eba96-3123-47ee-a374-50889549cc50","Type":"ContainerDied","Data":"66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85"} Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.477638 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h9bx9" event={"ID":"52cdb0d6-c8c2-4646-8350-f63892d098f5","Type":"ContainerStarted","Data":"7273aaf94f67fdc171eda0bb780fcd929bfe9ee12c8227aa61bb9a334adc6bef"} Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.478329 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" podStartSLOduration=4.47830942 podStartE2EDuration="4.47830942s" podCreationTimestamp="2025-10-02 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 18:45:04.476496893 +0000 UTC m=+1625.663992752" watchObservedRunningTime="2025-10-02 18:45:04.47830942 +0000 UTC m=+1625.665805279" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.522289 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-h9bx9" podStartSLOduration=4.6709827310000005 podStartE2EDuration="9.522258421s" podCreationTimestamp="2025-10-02 18:44:55 +0000 UTC" firstStartedPulling="2025-10-02 18:44:58.799268917 +0000 UTC m=+1619.986764776" lastFinishedPulling="2025-10-02 18:45:03.650544577 +0000 UTC m=+1624.838040466" observedRunningTime="2025-10-02 18:45:04.501752011 +0000 UTC m=+1625.689247870" watchObservedRunningTime="2025-10-02 18:45:04.522258421 +0000 UTC m=+1625.709754280" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.752924 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.843928 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom\") pod \"c60eba96-3123-47ee-a374-50889549cc50\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.844247 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j95qz\" (UniqueName: \"kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz\") pod \"c60eba96-3123-47ee-a374-50889549cc50\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.844422 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data\") pod \"c60eba96-3123-47ee-a374-50889549cc50\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.844628 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle\") pod \"c60eba96-3123-47ee-a374-50889549cc50\" (UID: \"c60eba96-3123-47ee-a374-50889549cc50\") " Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.850049 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c60eba96-3123-47ee-a374-50889549cc50" (UID: "c60eba96-3123-47ee-a374-50889549cc50"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.853105 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz" (OuterVolumeSpecName: "kube-api-access-j95qz") pod "c60eba96-3123-47ee-a374-50889549cc50" (UID: "c60eba96-3123-47ee-a374-50889549cc50"). InnerVolumeSpecName "kube-api-access-j95qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.911039 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c60eba96-3123-47ee-a374-50889549cc50" (UID: "c60eba96-3123-47ee-a374-50889549cc50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.936528 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data" (OuterVolumeSpecName: "config-data") pod "c60eba96-3123-47ee-a374-50889549cc50" (UID: "c60eba96-3123-47ee-a374-50889549cc50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.955247 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.955296 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.955314 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c60eba96-3123-47ee-a374-50889549cc50-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:04 crc kubenswrapper[4909]: I1002 18:45:04.955326 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j95qz\" (UniqueName: \"kubernetes.io/projected/c60eba96-3123-47ee-a374-50889549cc50-kube-api-access-j95qz\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.492139 4909 generic.go:334] "Generic (PLEG): container finished" podID="267cf7a5-8352-42ca-be8a-588a26e74159" containerID="86c7b37c85444e7e05ddbb33e8352baf763989e976e2246f51300b7f0ef77ee3" exitCode=0 Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.492208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" event={"ID":"267cf7a5-8352-42ca-be8a-588a26e74159","Type":"ContainerDied","Data":"86c7b37c85444e7e05ddbb33e8352baf763989e976e2246f51300b7f0ef77ee3"} Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.493912 4909 generic.go:334] "Generic (PLEG): container finished" podID="914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" containerID="04dcec55e75c6baf2c55053dd968c5fc5960fec99a4a5909136dc166788b6cc1" exitCode=0 Oct 02 
18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.493968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" event={"ID":"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15","Type":"ContainerDied","Data":"04dcec55e75c6baf2c55053dd968c5fc5960fec99a4a5909136dc166788b6cc1"} Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.496935 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b47684999-l2rd2" Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.499150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b47684999-l2rd2" event={"ID":"c60eba96-3123-47ee-a374-50889549cc50","Type":"ContainerDied","Data":"62119b083aa0aaee85af6a4d407e3d83076abd6d87d1f56fc45da0c08cb77429"} Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.499214 4909 scope.go:117] "RemoveContainer" containerID="66a58827ef94d25ff009efefa5190c17206828f9375e3e7cc377542a45de8b85" Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.639565 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"] Oct 02 18:45:05 crc kubenswrapper[4909]: I1002 18:45:05.643195 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7b47684999-l2rd2"] Oct 02 18:45:06 crc kubenswrapper[4909]: I1002 18:45:06.976015 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.110775 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume\") pod \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.110979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p9fn\" (UniqueName: \"kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn\") pod \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.111088 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume\") pod \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\" (UID: \"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.112072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume" (OuterVolumeSpecName: "config-volume") pod "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" (UID: "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.118288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn" (OuterVolumeSpecName: "kube-api-access-6p9fn") pod "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" (UID: "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15"). 
InnerVolumeSpecName "kube-api-access-6p9fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.129006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" (UID: "914fac7d-d3a1-4cb1-b2d7-8e8821f08e15"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.213282 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.213323 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p9fn\" (UniqueName: \"kubernetes.io/projected/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-kube-api-access-6p9fn\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.213335 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.268517 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.416388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key\") pod \"267cf7a5-8352-42ca-be8a-588a26e74159\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.416458 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlgp\" (UniqueName: \"kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp\") pod \"267cf7a5-8352-42ca-be8a-588a26e74159\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.416595 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle\") pod \"267cf7a5-8352-42ca-be8a-588a26e74159\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.416762 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory\") pod \"267cf7a5-8352-42ca-be8a-588a26e74159\" (UID: \"267cf7a5-8352-42ca-be8a-588a26e74159\") " Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.421279 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp" (OuterVolumeSpecName: "kube-api-access-xnlgp") pod "267cf7a5-8352-42ca-be8a-588a26e74159" (UID: "267cf7a5-8352-42ca-be8a-588a26e74159"). InnerVolumeSpecName "kube-api-access-xnlgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.423125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "267cf7a5-8352-42ca-be8a-588a26e74159" (UID: "267cf7a5-8352-42ca-be8a-588a26e74159"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.445532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory" (OuterVolumeSpecName: "inventory") pod "267cf7a5-8352-42ca-be8a-588a26e74159" (UID: "267cf7a5-8352-42ca-be8a-588a26e74159"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.446064 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "267cf7a5-8352-42ca-be8a-588a26e74159" (UID: "267cf7a5-8352-42ca-be8a-588a26e74159"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.519141 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.519170 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.519180 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlgp\" (UniqueName: \"kubernetes.io/projected/267cf7a5-8352-42ca-be8a-588a26e74159-kube-api-access-xnlgp\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.519191 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267cf7a5-8352-42ca-be8a-588a26e74159-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.527063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" event={"ID":"267cf7a5-8352-42ca-be8a-588a26e74159","Type":"ContainerDied","Data":"ad7b4c8df87c45f9dbbedeffecba14712853c04dea6c9479a87c1ec44b79b455"} Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.527352 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad7b4c8df87c45f9dbbedeffecba14712853c04dea6c9479a87c1ec44b79b455" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.527608 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.540131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" event={"ID":"914fac7d-d3a1-4cb1-b2d7-8e8821f08e15","Type":"ContainerDied","Data":"dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b"} Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.540182 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf61fe7680de2589d34f665c256ce3f30340822ccf706df172b4f6eda55e11b" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.540253 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.621107 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c60eba96-3123-47ee-a374-50889549cc50" path="/var/lib/kubelet/pods/c60eba96-3123-47ee-a374-50889549cc50/volumes" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.621649 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg"] Oct 02 18:45:07 crc kubenswrapper[4909]: E1002 18:45:07.621957 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.621970 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" Oct 02 18:45:07 crc kubenswrapper[4909]: E1002 18:45:07.621998 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" containerName="collect-profiles" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.622004 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" containerName="collect-profiles" Oct 02 18:45:07 crc kubenswrapper[4909]: E1002 18:45:07.622014 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267cf7a5-8352-42ca-be8a-588a26e74159" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.622021 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="267cf7a5-8352-42ca-be8a-588a26e74159" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.622276 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" containerName="collect-profiles" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.622299 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60eba96-3123-47ee-a374-50889549cc50" containerName="heat-engine" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.622309 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="267cf7a5-8352-42ca-be8a-588a26e74159" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.624458 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.627402 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.628702 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.628839 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.628967 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.716002 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg"] Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.729476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.729542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8s4k\" (UniqueName: \"kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 
18:45:07.729612 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.729830 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.832559 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.832748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.832787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8s4k\" (UniqueName: \"kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.832843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.838195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.838911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.839217 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.848520 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-w8s4k\" (UniqueName: \"kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:07 crc kubenswrapper[4909]: I1002 18:45:07.955171 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:45:08 crc kubenswrapper[4909]: I1002 18:45:08.369544 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg"] Oct 02 18:45:08 crc kubenswrapper[4909]: I1002 18:45:08.553314 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" event={"ID":"866bbad8-4176-4a13-9cbe-674edd9c52bb","Type":"ContainerStarted","Data":"d32e3d9c8df6a41cda2e1f984e4341e18dbffec81cd8278183fe14d1bd7b3e1c"} Oct 02 18:45:08 crc kubenswrapper[4909]: I1002 18:45:08.554583 4909 generic.go:334] "Generic (PLEG): container finished" podID="52cdb0d6-c8c2-4646-8350-f63892d098f5" containerID="7273aaf94f67fdc171eda0bb780fcd929bfe9ee12c8227aa61bb9a334adc6bef" exitCode=0 Oct 02 18:45:08 crc kubenswrapper[4909]: I1002 18:45:08.554610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h9bx9" event={"ID":"52cdb0d6-c8c2-4646-8350-f63892d098f5","Type":"ContainerDied","Data":"7273aaf94f67fdc171eda0bb780fcd929bfe9ee12c8227aa61bb9a334adc6bef"} Oct 02 18:45:09 crc kubenswrapper[4909]: I1002 18:45:09.565125 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" event={"ID":"866bbad8-4176-4a13-9cbe-674edd9c52bb","Type":"ContainerStarted","Data":"255972131ad06ab7721d6ced1cebe0a1c754b43b7b1c0070db376c5605dfd772"} Oct 02 18:45:09 crc kubenswrapper[4909]: I1002 
18:45:09.597213 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" podStartSLOduration=2.042345793 podStartE2EDuration="2.597194855s" podCreationTimestamp="2025-10-02 18:45:07 +0000 UTC" firstStartedPulling="2025-10-02 18:45:08.430729771 +0000 UTC m=+1629.618225630" lastFinishedPulling="2025-10-02 18:45:08.985578843 +0000 UTC m=+1630.173074692" observedRunningTime="2025-10-02 18:45:09.586073898 +0000 UTC m=+1630.773569827" watchObservedRunningTime="2025-10-02 18:45:09.597194855 +0000 UTC m=+1630.784690714" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.054128 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.201552 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data\") pod \"52cdb0d6-c8c2-4646-8350-f63892d098f5\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.201596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts\") pod \"52cdb0d6-c8c2-4646-8350-f63892d098f5\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.201670 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjnsc\" (UniqueName: \"kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc\") pod \"52cdb0d6-c8c2-4646-8350-f63892d098f5\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.201769 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle\") pod \"52cdb0d6-c8c2-4646-8350-f63892d098f5\" (UID: \"52cdb0d6-c8c2-4646-8350-f63892d098f5\") " Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.206896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts" (OuterVolumeSpecName: "scripts") pod "52cdb0d6-c8c2-4646-8350-f63892d098f5" (UID: "52cdb0d6-c8c2-4646-8350-f63892d098f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.212046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc" (OuterVolumeSpecName: "kube-api-access-tjnsc") pod "52cdb0d6-c8c2-4646-8350-f63892d098f5" (UID: "52cdb0d6-c8c2-4646-8350-f63892d098f5"). InnerVolumeSpecName "kube-api-access-tjnsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.232216 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52cdb0d6-c8c2-4646-8350-f63892d098f5" (UID: "52cdb0d6-c8c2-4646-8350-f63892d098f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.237145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data" (OuterVolumeSpecName: "config-data") pod "52cdb0d6-c8c2-4646-8350-f63892d098f5" (UID: "52cdb0d6-c8c2-4646-8350-f63892d098f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.304958 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.305000 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.305012 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjnsc\" (UniqueName: \"kubernetes.io/projected/52cdb0d6-c8c2-4646-8350-f63892d098f5-kube-api-access-tjnsc\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.305034 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cdb0d6-c8c2-4646-8350-f63892d098f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.579648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h9bx9" event={"ID":"52cdb0d6-c8c2-4646-8350-f63892d098f5","Type":"ContainerDied","Data":"97a8028ea3e25fd04b84c14bf3b33aa3e31e7daac9a10f908cea48d1caf58cde"} Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.579694 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a8028ea3e25fd04b84c14bf3b33aa3e31e7daac9a10f908cea48d1caf58cde" Oct 02 18:45:10 crc kubenswrapper[4909]: I1002 18:45:10.579663 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-h9bx9" Oct 02 18:45:12 crc kubenswrapper[4909]: I1002 18:45:12.608785 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:12 crc kubenswrapper[4909]: E1002 18:45:12.609670 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.392904 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.394139 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-api" containerID="cri-o://47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" gracePeriod=30 Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.394189 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-listener" containerID="cri-o://0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" gracePeriod=30 Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.394250 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-notifier" containerID="cri-o://7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" gracePeriod=30 Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.394338 4909 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/aodh-0" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-evaluator" containerID="cri-o://79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" gracePeriod=30 Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.690504 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerID="47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" exitCode=0 Oct 02 18:45:15 crc kubenswrapper[4909]: I1002 18:45:15.690548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerDied","Data":"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601"} Oct 02 18:45:16 crc kubenswrapper[4909]: I1002 18:45:16.706749 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerID="79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" exitCode=0 Oct 02 18:45:16 crc kubenswrapper[4909]: I1002 18:45:16.706832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerDied","Data":"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825"} Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.412042 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501285 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501392 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501436 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501467 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg8rp\" (UniqueName: \"kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.501509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data\") pod \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\" (UID: \"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7\") " Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.510271 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp" (OuterVolumeSpecName: "kube-api-access-xg8rp") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "kube-api-access-xg8rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.528562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts" (OuterVolumeSpecName: "scripts") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.589922 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.606785 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.607135 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg8rp\" (UniqueName: \"kubernetes.io/projected/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-kube-api-access-xg8rp\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.607220 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.636498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.709197 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.712151 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.716277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data" (OuterVolumeSpecName: "config-data") pod "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" (UID: "ff5a21cf-9a05-439a-8ea5-5569c42bf4d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.724854 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerID="0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" exitCode=0 Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.724898 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerID="7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" exitCode=0 Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.724946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerDied","Data":"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38"} Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.724964 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.725108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerDied","Data":"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd"} Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.725132 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ff5a21cf-9a05-439a-8ea5-5569c42bf4d7","Type":"ContainerDied","Data":"f77258fa0c6be8faf2df03d25ab49d56117ae6cc49e5f1559ba5a282b6f3529d"} Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.725154 4909 scope.go:117] "RemoveContainer" containerID="0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.765992 4909 scope.go:117] "RemoveContainer" containerID="7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.780887 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.795972 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.807211 4909 scope.go:117] "RemoveContainer" containerID="79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.808463 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.809179 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-listener" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809202 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-listener" Oct 02 18:45:17 
crc kubenswrapper[4909]: E1002 18:45:17.809219 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cdb0d6-c8c2-4646-8350-f63892d098f5" containerName="aodh-db-sync" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809226 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cdb0d6-c8c2-4646-8350-f63892d098f5" containerName="aodh-db-sync" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.809237 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-api" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809244 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-api" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.809252 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-evaluator" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809258 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-evaluator" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.809287 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-notifier" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809293 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-notifier" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809638 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-evaluator" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809658 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-notifier" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 
18:45:17.809671 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-listener" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809687 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cdb0d6-c8c2-4646-8350-f63892d098f5" containerName="aodh-db-sync" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.809696 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" containerName="aodh-api" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.812154 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.812211 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.812291 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.818703 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t2tfr" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.819128 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.819258 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.819393 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.819548 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.829735 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.863889 4909 scope.go:117] "RemoveContainer" containerID="47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.890209 4909 scope.go:117] "RemoveContainer" containerID="0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.890788 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38\": container with ID starting with 0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38 not found: ID does not exist" containerID="0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.890830 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38"} err="failed to get container status \"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38\": rpc error: code = NotFound desc = could not find container \"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38\": container with ID starting with 0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38 not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.890852 4909 scope.go:117] "RemoveContainer" containerID="7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.891573 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd\": container with ID starting with 7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd not found: ID does not exist" containerID="7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.891652 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd"} err="failed to get container status \"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd\": rpc error: code = NotFound desc = could not find container \"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd\": container with ID starting with 7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.891719 4909 scope.go:117] "RemoveContainer" containerID="79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.892247 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825\": container with ID starting with 79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825 not found: ID does not exist" containerID="79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.892270 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825"} err="failed to get container status \"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825\": rpc error: code = NotFound desc = could not find container \"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825\": container with ID starting with 79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825 not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.892287 4909 scope.go:117] "RemoveContainer" containerID="47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" Oct 02 18:45:17 crc kubenswrapper[4909]: E1002 18:45:17.892589 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601\": container with ID starting with 47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601 not found: ID does not exist" containerID="47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.892663 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601"} err="failed to get container status \"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601\": rpc error: code = NotFound desc = could not find container 
\"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601\": container with ID starting with 47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601 not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.892677 4909 scope.go:117] "RemoveContainer" containerID="0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.892997 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38"} err="failed to get container status \"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38\": rpc error: code = NotFound desc = could not find container \"0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38\": container with ID starting with 0fd4b14b3dee0fb3b1dbb422373d6b53684261940f3501d528e2af417e744e38 not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.893018 4909 scope.go:117] "RemoveContainer" containerID="7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.893290 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd"} err="failed to get container status \"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd\": rpc error: code = NotFound desc = could not find container \"7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd\": container with ID starting with 7822025bdc028b3e5f135248f318481595e986c65c24e9f2bad09a0584d691dd not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.893342 4909 scope.go:117] "RemoveContainer" containerID="79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.893648 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825"} err="failed to get container status \"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825\": rpc error: code = NotFound desc = could not find container \"79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825\": container with ID starting with 79dbbc2c2d24585b6efdcb1208a161b2a677bba5b8a9bce5c12f336f7f5e3825 not found: ID does not exist" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.893714 4909 scope.go:117] "RemoveContainer" containerID="47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601" Oct 02 18:45:17 crc kubenswrapper[4909]: I1002 18:45:17.895397 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601"} err="failed to get container status \"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601\": rpc error: code = NotFound desc = could not find container \"47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601\": container with ID starting with 47b16b0c5a73732d5a3895ab6ccdd4d3475ab10f91500770f3baaaac5196f601 not found: ID does not exist" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.017511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-public-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.017561 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kg28\" (UniqueName: \"kubernetes.io/projected/325c0e41-6d5b-4920-8d81-5a161eb3189a-kube-api-access-6kg28\") pod \"aodh-0\" (UID: 
\"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.018595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-internal-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.018658 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.019020 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-config-data\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.019150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-scripts\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121269 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-public-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6kg28\" (UniqueName: \"kubernetes.io/projected/325c0e41-6d5b-4920-8d81-5a161eb3189a-kube-api-access-6kg28\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121363 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-internal-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121413 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121504 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-config-data\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.121566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-scripts\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.126814 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-public-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.130810 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.131085 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-config-data\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.131633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-internal-tls-certs\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.135880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c0e41-6d5b-4920-8d81-5a161eb3189a-scripts\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.152972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kg28\" (UniqueName: \"kubernetes.io/projected/325c0e41-6d5b-4920-8d81-5a161eb3189a-kube-api-access-6kg28\") pod \"aodh-0\" (UID: \"325c0e41-6d5b-4920-8d81-5a161eb3189a\") " pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.153614 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.707545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 18:45:18 crc kubenswrapper[4909]: I1002 18:45:18.738020 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"325c0e41-6d5b-4920-8d81-5a161eb3189a","Type":"ContainerStarted","Data":"960a1ba1a8afb618f0d7ce9ac25bcffe7eb7436d53d48023f4881f850845611e"} Oct 02 18:45:19 crc kubenswrapper[4909]: I1002 18:45:19.281909 4909 scope.go:117] "RemoveContainer" containerID="8f7ad0bec3f246f980174fcd5828e9eb8614997292f5de07bfa1aa0b5c83b23e" Oct 02 18:45:19 crc kubenswrapper[4909]: I1002 18:45:19.658974 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5a21cf-9a05-439a-8ea5-5569c42bf4d7" path="/var/lib/kubelet/pods/ff5a21cf-9a05-439a-8ea5-5569c42bf4d7/volumes" Oct 02 18:45:19 crc kubenswrapper[4909]: I1002 18:45:19.751111 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"325c0e41-6d5b-4920-8d81-5a161eb3189a","Type":"ContainerStarted","Data":"ef68dcbde52825f060ba029183b3bbe743093aa9243866a2ee29bf1ab649642d"} Oct 02 18:45:21 crc kubenswrapper[4909]: I1002 18:45:21.774405 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"325c0e41-6d5b-4920-8d81-5a161eb3189a","Type":"ContainerStarted","Data":"cfe4752846393431367718b8ac1d458fac0172c9ed359f54f8df13b6018f5bf2"} Oct 02 18:45:22 crc kubenswrapper[4909]: I1002 18:45:22.788212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"325c0e41-6d5b-4920-8d81-5a161eb3189a","Type":"ContainerStarted","Data":"2e4f530e1587ad5e2fd30ab5d36ff7164c1cf961e4f4a8409d20ee199273142d"} Oct 02 18:45:23 crc kubenswrapper[4909]: I1002 18:45:23.608357 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:23 crc 
kubenswrapper[4909]: E1002 18:45:23.608813 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:45:24 crc kubenswrapper[4909]: I1002 18:45:24.812174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"325c0e41-6d5b-4920-8d81-5a161eb3189a","Type":"ContainerStarted","Data":"0d60f57e6ee64dd6b3132d65488b0141eecdb063586e7d98837ca8280cfd0886"} Oct 02 18:45:24 crc kubenswrapper[4909]: I1002 18:45:24.841282 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.708812143 podStartE2EDuration="7.841263801s" podCreationTimestamp="2025-10-02 18:45:17 +0000 UTC" firstStartedPulling="2025-10-02 18:45:18.708751067 +0000 UTC m=+1639.896246926" lastFinishedPulling="2025-10-02 18:45:23.841202705 +0000 UTC m=+1645.028698584" observedRunningTime="2025-10-02 18:45:24.832407155 +0000 UTC m=+1646.019903084" watchObservedRunningTime="2025-10-02 18:45:24.841263801 +0000 UTC m=+1646.028759660" Oct 02 18:45:36 crc kubenswrapper[4909]: I1002 18:45:36.609780 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:36 crc kubenswrapper[4909]: E1002 18:45:36.611249 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:45:48 crc kubenswrapper[4909]: I1002 18:45:48.608851 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:48 crc kubenswrapper[4909]: E1002 18:45:48.610406 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:45:59 crc kubenswrapper[4909]: I1002 18:45:59.643477 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:45:59 crc kubenswrapper[4909]: E1002 18:45:59.645088 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:46:10 crc kubenswrapper[4909]: I1002 18:46:10.608189 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:46:10 crc kubenswrapper[4909]: E1002 18:46:10.609132 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:46:19 crc kubenswrapper[4909]: I1002 18:46:19.576833 4909 scope.go:117] "RemoveContainer" containerID="6cf01c83ac2ad4c8461040d5fb82d2fdf259acd2f9b842d337c2652f0f38d006" Oct 02 18:46:19 crc kubenswrapper[4909]: I1002 18:46:19.600622 4909 scope.go:117] "RemoveContainer" containerID="90f0912ef82d705372af384b0b68d71e34c881e06fb003e6dc5b7df612a74129" Oct 02 18:46:19 crc kubenswrapper[4909]: I1002 18:46:19.638314 4909 scope.go:117] "RemoveContainer" containerID="45e0d6c2580e99c6e480e5d0c9b179166fe8a90bf1e2b479963a6c60c1f3ca3e" Oct 02 18:46:19 crc kubenswrapper[4909]: I1002 18:46:19.688775 4909 scope.go:117] "RemoveContainer" containerID="a4886512dd74e4bf92e75a07d365db5432c367595ddfd519be9f05f6fc8878b8" Oct 02 18:46:23 crc kubenswrapper[4909]: I1002 18:46:23.608136 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:46:23 crc kubenswrapper[4909]: E1002 18:46:23.609175 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.188304 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.191085 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.203247 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.366851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmhj\" (UniqueName: \"kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.366924 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.367015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.470086 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmhj\" (UniqueName: \"kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.470223 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.470331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.470728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.470755 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.491532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmhj\" (UniqueName: \"kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj\") pod \"redhat-operators-gm64b\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:26 crc kubenswrapper[4909]: I1002 18:46:26.512272 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:27 crc kubenswrapper[4909]: I1002 18:46:27.021972 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:27 crc kubenswrapper[4909]: I1002 18:46:27.702452 4909 generic.go:334] "Generic (PLEG): container finished" podID="0799dbed-f373-42e8-b652-8b2f83148759" containerID="548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226" exitCode=0 Oct 02 18:46:27 crc kubenswrapper[4909]: I1002 18:46:27.702872 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerDied","Data":"548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226"} Oct 02 18:46:27 crc kubenswrapper[4909]: I1002 18:46:27.702959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerStarted","Data":"99374c790e717e3105af88a98a81a99f5348357ac2748941b92ab6f72f3e9d83"} Oct 02 18:46:29 crc kubenswrapper[4909]: I1002 18:46:29.728035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerStarted","Data":"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934"} Oct 02 18:46:34 crc kubenswrapper[4909]: E1002 18:46:34.058802 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0799dbed_f373_42e8_b652_8b2f83148759.slice/crio-conmon-b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934.scope\": RecentStats: unable to find data in memory cache]" Oct 02 18:46:34 crc kubenswrapper[4909]: I1002 18:46:34.813203 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="0799dbed-f373-42e8-b652-8b2f83148759" containerID="b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934" exitCode=0 Oct 02 18:46:34 crc kubenswrapper[4909]: I1002 18:46:34.813299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerDied","Data":"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934"} Oct 02 18:46:35 crc kubenswrapper[4909]: I1002 18:46:35.825085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerStarted","Data":"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6"} Oct 02 18:46:35 crc kubenswrapper[4909]: I1002 18:46:35.856356 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gm64b" podStartSLOduration=2.087478761 podStartE2EDuration="9.856337411s" podCreationTimestamp="2025-10-02 18:46:26 +0000 UTC" firstStartedPulling="2025-10-02 18:46:27.704246499 +0000 UTC m=+1708.891742368" lastFinishedPulling="2025-10-02 18:46:35.473105159 +0000 UTC m=+1716.660601018" observedRunningTime="2025-10-02 18:46:35.848207186 +0000 UTC m=+1717.035703045" watchObservedRunningTime="2025-10-02 18:46:35.856337411 +0000 UTC m=+1717.043833270" Oct 02 18:46:36 crc kubenswrapper[4909]: I1002 18:46:36.513583 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:36 crc kubenswrapper[4909]: I1002 18:46:36.513673 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:37 crc kubenswrapper[4909]: I1002 18:46:37.572123 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gm64b" 
podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="registry-server" probeResult="failure" output=< Oct 02 18:46:37 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:46:37 crc kubenswrapper[4909]: > Oct 02 18:46:37 crc kubenswrapper[4909]: I1002 18:46:37.608984 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:46:37 crc kubenswrapper[4909]: E1002 18:46:37.609356 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:46:46 crc kubenswrapper[4909]: I1002 18:46:46.566433 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:46 crc kubenswrapper[4909]: I1002 18:46:46.657868 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:46 crc kubenswrapper[4909]: I1002 18:46:46.811668 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:47 crc kubenswrapper[4909]: I1002 18:46:47.987996 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gm64b" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="registry-server" containerID="cri-o://589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6" gracePeriod=2 Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.529936 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.608884 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:46:48 crc kubenswrapper[4909]: E1002 18:46:48.609373 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.646722 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities\") pod \"0799dbed-f373-42e8-b652-8b2f83148759\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.646810 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfmhj\" (UniqueName: \"kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj\") pod \"0799dbed-f373-42e8-b652-8b2f83148759\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.646935 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content\") pod \"0799dbed-f373-42e8-b652-8b2f83148759\" (UID: \"0799dbed-f373-42e8-b652-8b2f83148759\") " Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.648049 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities" (OuterVolumeSpecName: "utilities") pod "0799dbed-f373-42e8-b652-8b2f83148759" (UID: "0799dbed-f373-42e8-b652-8b2f83148759"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.654743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj" (OuterVolumeSpecName: "kube-api-access-kfmhj") pod "0799dbed-f373-42e8-b652-8b2f83148759" (UID: "0799dbed-f373-42e8-b652-8b2f83148759"). InnerVolumeSpecName "kube-api-access-kfmhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.730128 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0799dbed-f373-42e8-b652-8b2f83148759" (UID: "0799dbed-f373-42e8-b652-8b2f83148759"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.751478 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.751504 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfmhj\" (UniqueName: \"kubernetes.io/projected/0799dbed-f373-42e8-b652-8b2f83148759-kube-api-access-kfmhj\") on node \"crc\" DevicePath \"\"" Oct 02 18:46:48 crc kubenswrapper[4909]: I1002 18:46:48.751517 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0799dbed-f373-42e8-b652-8b2f83148759-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.004070 4909 generic.go:334] "Generic (PLEG): container finished" podID="0799dbed-f373-42e8-b652-8b2f83148759" containerID="589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6" exitCode=0 Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.004160 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerDied","Data":"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6"} Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.005869 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gm64b" event={"ID":"0799dbed-f373-42e8-b652-8b2f83148759","Type":"ContainerDied","Data":"99374c790e717e3105af88a98a81a99f5348357ac2748941b92ab6f72f3e9d83"} Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.004179 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gm64b" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.005914 4909 scope.go:117] "RemoveContainer" containerID="589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.034245 4909 scope.go:117] "RemoveContainer" containerID="b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.059819 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.069206 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gm64b"] Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.088085 4909 scope.go:117] "RemoveContainer" containerID="548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.135974 4909 scope.go:117] "RemoveContainer" containerID="589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6" Oct 02 18:46:49 crc kubenswrapper[4909]: E1002 18:46:49.136646 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6\": container with ID starting with 589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6 not found: ID does not exist" containerID="589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.136708 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6"} err="failed to get container status \"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6\": rpc error: code = NotFound desc = could not find container 
\"589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6\": container with ID starting with 589a65f8b86658c2632c9a409232c2fb13d83d786c9198179bd1a89059c5b9f6 not found: ID does not exist" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.136752 4909 scope.go:117] "RemoveContainer" containerID="b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934" Oct 02 18:46:49 crc kubenswrapper[4909]: E1002 18:46:49.137462 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934\": container with ID starting with b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934 not found: ID does not exist" containerID="b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.137532 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934"} err="failed to get container status \"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934\": rpc error: code = NotFound desc = could not find container \"b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934\": container with ID starting with b11fb4be52988d5c2ee2aa48e3e6c0ede9cf74c0088e2691a4b0b86b1c7f0934 not found: ID does not exist" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.137601 4909 scope.go:117] "RemoveContainer" containerID="548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226" Oct 02 18:46:49 crc kubenswrapper[4909]: E1002 18:46:49.143205 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226\": container with ID starting with 548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226 not found: ID does not exist" 
containerID="548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.143243 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226"} err="failed to get container status \"548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226\": rpc error: code = NotFound desc = could not find container \"548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226\": container with ID starting with 548fd787eb1be521c5313481498b9750ba12149c9afe18e13a302482d6a5b226 not found: ID does not exist" Oct 02 18:46:49 crc kubenswrapper[4909]: I1002 18:46:49.622281 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0799dbed-f373-42e8-b652-8b2f83148759" path="/var/lib/kubelet/pods/0799dbed-f373-42e8-b652-8b2f83148759/volumes" Oct 02 18:47:03 crc kubenswrapper[4909]: I1002 18:47:03.609894 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:47:03 crc kubenswrapper[4909]: E1002 18:47:03.611089 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:47:15 crc kubenswrapper[4909]: I1002 18:47:15.608750 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:47:15 crc kubenswrapper[4909]: E1002 18:47:15.609541 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:47:28 crc kubenswrapper[4909]: I1002 18:47:28.609207 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:47:28 crc kubenswrapper[4909]: E1002 18:47:28.611051 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:47:41 crc kubenswrapper[4909]: I1002 18:47:41.612818 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:47:41 crc kubenswrapper[4909]: E1002 18:47:41.613912 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:47:55 crc kubenswrapper[4909]: I1002 18:47:55.608961 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:47:55 crc kubenswrapper[4909]: E1002 18:47:55.610253 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:48:09 crc kubenswrapper[4909]: I1002 18:48:09.619719 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:48:09 crc kubenswrapper[4909]: E1002 18:48:09.620734 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:48:19 crc kubenswrapper[4909]: I1002 18:48:19.958089 4909 scope.go:117] "RemoveContainer" containerID="18fd289cab681d1b85aee0fa477341f650d2ec2921c3848b71442c545bec6afb" Oct 02 18:48:19 crc kubenswrapper[4909]: I1002 18:48:19.993371 4909 scope.go:117] "RemoveContainer" containerID="3b721b6867628d8c4e9bdadd2a0ffa1f6945806ebf39515b395ba0fb4c1ab48d" Oct 02 18:48:20 crc kubenswrapper[4909]: I1002 18:48:20.033247 4909 scope.go:117] "RemoveContainer" containerID="c6a801bc60f11ab0c0662b7f78d788419233f8d0754312bb33cdee33c2f0185e" Oct 02 18:48:20 crc kubenswrapper[4909]: I1002 18:48:20.092275 4909 scope.go:117] "RemoveContainer" containerID="7b3f29e1867c04993e0899074e0472781c29e2189fea3c4f84d4edf49c5b8948" Oct 02 18:48:20 crc kubenswrapper[4909]: I1002 18:48:20.141992 4909 scope.go:117] "RemoveContainer" containerID="ba435c93015d0d32f1ba47dc65036de355fb859da9b8d9f5b78d8c4e6f712460" Oct 02 18:48:21 crc kubenswrapper[4909]: I1002 18:48:21.611802 4909 scope.go:117] "RemoveContainer" 
containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:48:21 crc kubenswrapper[4909]: E1002 18:48:21.612236 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:48:24 crc kubenswrapper[4909]: I1002 18:48:24.207452 4909 generic.go:334] "Generic (PLEG): container finished" podID="866bbad8-4176-4a13-9cbe-674edd9c52bb" containerID="255972131ad06ab7721d6ced1cebe0a1c754b43b7b1c0070db376c5605dfd772" exitCode=0 Oct 02 18:48:24 crc kubenswrapper[4909]: I1002 18:48:24.207558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" event={"ID":"866bbad8-4176-4a13-9cbe-674edd9c52bb","Type":"ContainerDied","Data":"255972131ad06ab7721d6ced1cebe0a1c754b43b7b1c0070db376c5605dfd772"} Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.669162 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.699704 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle\") pod \"866bbad8-4176-4a13-9cbe-674edd9c52bb\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.699787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key\") pod \"866bbad8-4176-4a13-9cbe-674edd9c52bb\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.699821 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory\") pod \"866bbad8-4176-4a13-9cbe-674edd9c52bb\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.699869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8s4k\" (UniqueName: \"kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k\") pod \"866bbad8-4176-4a13-9cbe-674edd9c52bb\" (UID: \"866bbad8-4176-4a13-9cbe-674edd9c52bb\") " Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.731293 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k" (OuterVolumeSpecName: "kube-api-access-w8s4k") pod "866bbad8-4176-4a13-9cbe-674edd9c52bb" (UID: "866bbad8-4176-4a13-9cbe-674edd9c52bb"). InnerVolumeSpecName "kube-api-access-w8s4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.733582 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "866bbad8-4176-4a13-9cbe-674edd9c52bb" (UID: "866bbad8-4176-4a13-9cbe-674edd9c52bb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.761874 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "866bbad8-4176-4a13-9cbe-674edd9c52bb" (UID: "866bbad8-4176-4a13-9cbe-674edd9c52bb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.762447 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory" (OuterVolumeSpecName: "inventory") pod "866bbad8-4176-4a13-9cbe-674edd9c52bb" (UID: "866bbad8-4176-4a13-9cbe-674edd9c52bb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.802476 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.802535 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.802553 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bbad8-4176-4a13-9cbe-674edd9c52bb-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:48:25 crc kubenswrapper[4909]: I1002 18:48:25.802565 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8s4k\" (UniqueName: \"kubernetes.io/projected/866bbad8-4176-4a13-9cbe-674edd9c52bb-kube-api-access-w8s4k\") on node \"crc\" DevicePath \"\"" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.232433 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" event={"ID":"866bbad8-4176-4a13-9cbe-674edd9c52bb","Type":"ContainerDied","Data":"d32e3d9c8df6a41cda2e1f984e4341e18dbffec81cd8278183fe14d1bd7b3e1c"} Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.232774 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d32e3d9c8df6a41cda2e1f984e4341e18dbffec81cd8278183fe14d1bd7b3e1c" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.232480 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328098 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr"] Oct 02 18:48:26 crc kubenswrapper[4909]: E1002 18:48:26.328531 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="extract-utilities" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328548 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="extract-utilities" Oct 02 18:48:26 crc kubenswrapper[4909]: E1002 18:48:26.328561 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="extract-content" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328568 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="extract-content" Oct 02 18:48:26 crc kubenswrapper[4909]: E1002 18:48:26.328590 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866bbad8-4176-4a13-9cbe-674edd9c52bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328597 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="866bbad8-4176-4a13-9cbe-674edd9c52bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 18:48:26 crc kubenswrapper[4909]: E1002 18:48:26.328616 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="registry-server" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="registry-server" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328805 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="866bbad8-4176-4a13-9cbe-674edd9c52bb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.328827 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0799dbed-f373-42e8-b652-8b2f83148759" containerName="registry-server" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.329596 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.338821 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr"] Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.339260 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.339475 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.339601 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.339715 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.438135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: 
I1002 18:48:26.438348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxp6\" (UniqueName: \"kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.438987 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.541579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.541680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxp6\" (UniqueName: \"kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.541731 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.548080 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.552832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.565618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxp6\" (UniqueName: \"kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:26 crc kubenswrapper[4909]: I1002 18:48:26.657753 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:48:27 crc kubenswrapper[4909]: I1002 18:48:27.203155 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr"] Oct 02 18:48:27 crc kubenswrapper[4909]: I1002 18:48:27.243178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" event={"ID":"df963eb6-4e59-4dc7-ab3f-dbb8459276a2","Type":"ContainerStarted","Data":"ee18fa76b9be020b2309c17905cd4fa460577422ee9f6317f48823f8e80af8bb"} Oct 02 18:48:29 crc kubenswrapper[4909]: I1002 18:48:29.268174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" event={"ID":"df963eb6-4e59-4dc7-ab3f-dbb8459276a2","Type":"ContainerStarted","Data":"e848759f97d2b6a92addc5dee4821498af6061fb632a26e5e5393c1b9dcfa374"} Oct 02 18:48:29 crc kubenswrapper[4909]: I1002 18:48:29.292887 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" podStartSLOduration=2.431813104 podStartE2EDuration="3.292860285s" podCreationTimestamp="2025-10-02 18:48:26 +0000 UTC" firstStartedPulling="2025-10-02 18:48:27.203084768 +0000 UTC m=+1828.390580637" lastFinishedPulling="2025-10-02 18:48:28.064131939 +0000 UTC m=+1829.251627818" observedRunningTime="2025-10-02 18:48:29.288169318 +0000 UTC m=+1830.475665197" watchObservedRunningTime="2025-10-02 18:48:29.292860285 +0000 UTC m=+1830.480356154" Oct 02 18:48:36 crc kubenswrapper[4909]: I1002 18:48:36.609386 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:48:36 crc kubenswrapper[4909]: E1002 18:48:36.610583 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:48:38 crc kubenswrapper[4909]: I1002 18:48:38.071346 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5dqlm"] Oct 02 18:48:38 crc kubenswrapper[4909]: I1002 18:48:38.080646 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6z9g5"] Oct 02 18:48:38 crc kubenswrapper[4909]: I1002 18:48:38.091277 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5dqlm"] Oct 02 18:48:38 crc kubenswrapper[4909]: I1002 18:48:38.099940 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6z9g5"] Oct 02 18:48:39 crc kubenswrapper[4909]: I1002 18:48:39.643339 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225" path="/var/lib/kubelet/pods/85d4ccd2-92a9-4408-b2d9-dc8c1a2d9225/volumes" Oct 02 18:48:39 crc kubenswrapper[4909]: I1002 18:48:39.646486 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97095a9f-e6b9-49d3-8fd5-684e40e9d6b1" path="/var/lib/kubelet/pods/97095a9f-e6b9-49d3-8fd5-684e40e9d6b1/volumes" Oct 02 18:48:41 crc kubenswrapper[4909]: I1002 18:48:41.041570 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kvwsj"] Oct 02 18:48:41 crc kubenswrapper[4909]: I1002 18:48:41.057991 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kvwsj"] Oct 02 18:48:41 crc kubenswrapper[4909]: I1002 18:48:41.632918 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8c24f10a-b08b-41f8-a71c-c406f531ea88" path="/var/lib/kubelet/pods/8c24f10a-b08b-41f8-a71c-c406f531ea88/volumes" Oct 02 18:48:44 crc kubenswrapper[4909]: I1002 18:48:44.040104 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-kc8f6"] Oct 02 18:48:44 crc kubenswrapper[4909]: I1002 18:48:44.059332 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-kc8f6"] Oct 02 18:48:45 crc kubenswrapper[4909]: I1002 18:48:45.626864 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e0572e-0def-4e7e-958d-8272391b538d" path="/var/lib/kubelet/pods/11e0572e-0def-4e7e-958d-8272391b538d/volumes" Oct 02 18:48:47 crc kubenswrapper[4909]: I1002 18:48:47.040073 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-4f3e-account-create-gdbfn"] Oct 02 18:48:47 crc kubenswrapper[4909]: I1002 18:48:47.060950 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-4f3e-account-create-gdbfn"] Oct 02 18:48:47 crc kubenswrapper[4909]: I1002 18:48:47.620863 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff1a7f2-7ef1-401d-9597-52682290af50" path="/var/lib/kubelet/pods/8ff1a7f2-7ef1-401d-9597-52682290af50/volumes" Oct 02 18:48:49 crc kubenswrapper[4909]: I1002 18:48:49.631725 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:48:49 crc kubenswrapper[4909]: E1002 18:48:49.632676 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:48:52 crc 
kubenswrapper[4909]: I1002 18:48:52.069494 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-v55k7"] Oct 02 18:48:52 crc kubenswrapper[4909]: I1002 18:48:52.088270 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-v55k7"] Oct 02 18:48:53 crc kubenswrapper[4909]: I1002 18:48:53.630750 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed2698b-26a9-4151-b915-ca44218b56b0" path="/var/lib/kubelet/pods/7ed2698b-26a9-4151-b915-ca44218b56b0/volumes" Oct 02 18:48:54 crc kubenswrapper[4909]: I1002 18:48:54.051201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-adab-account-create-gtlhs"] Oct 02 18:48:54 crc kubenswrapper[4909]: I1002 18:48:54.066407 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0028-account-create-72r8j"] Oct 02 18:48:54 crc kubenswrapper[4909]: I1002 18:48:54.079994 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-adab-account-create-gtlhs"] Oct 02 18:48:54 crc kubenswrapper[4909]: I1002 18:48:54.092296 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0028-account-create-72r8j"] Oct 02 18:48:55 crc kubenswrapper[4909]: I1002 18:48:55.632479 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5aea08-beac-4fe9-adcf-7db40c37bbba" path="/var/lib/kubelet/pods/bd5aea08-beac-4fe9-adcf-7db40c37bbba/volumes" Oct 02 18:48:55 crc kubenswrapper[4909]: I1002 18:48:55.638293 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8724c33-84bd-4420-a55c-b55ae1b7484a" path="/var/lib/kubelet/pods/d8724c33-84bd-4420-a55c-b55ae1b7484a/volumes" Oct 02 18:49:00 crc kubenswrapper[4909]: I1002 18:49:00.064126 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e2fa-account-create-7lx7n"] Oct 02 18:49:00 crc kubenswrapper[4909]: I1002 
18:49:00.085697 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e2fa-account-create-7lx7n"] Oct 02 18:49:01 crc kubenswrapper[4909]: I1002 18:49:01.609684 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:49:01 crc kubenswrapper[4909]: E1002 18:49:01.610227 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:49:01 crc kubenswrapper[4909]: I1002 18:49:01.639090 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9dcfa1-b647-423c-9a3c-0a5f19d133cb" path="/var/lib/kubelet/pods/6f9dcfa1-b647-423c-9a3c-0a5f19d133cb/volumes" Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.055543 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-krg67"] Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.070078 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ftndn"] Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.083298 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-krg67"] Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.093808 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ftndn"] Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.104648 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-l9hfg"] Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.112546 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-l9hfg"] Oct 02 
18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.631252 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0965a26d-8f10-4685-89be-50e446307d30" path="/var/lib/kubelet/pods/0965a26d-8f10-4685-89be-50e446307d30/volumes" Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.632170 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c112c9e-1a94-4ff4-8b92-c712a835851c" path="/var/lib/kubelet/pods/2c112c9e-1a94-4ff4-8b92-c712a835851c/volumes" Oct 02 18:49:03 crc kubenswrapper[4909]: I1002 18:49:03.632820 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a70c14f-d78d-4190-b731-68d3b606796d" path="/var/lib/kubelet/pods/3a70c14f-d78d-4190-b731-68d3b606796d/volumes" Oct 02 18:49:04 crc kubenswrapper[4909]: I1002 18:49:04.038144 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-c778-account-create-2slnp"] Oct 02 18:49:04 crc kubenswrapper[4909]: I1002 18:49:04.049402 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-z6bdb"] Oct 02 18:49:04 crc kubenswrapper[4909]: I1002 18:49:04.060142 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-c778-account-create-2slnp"] Oct 02 18:49:04 crc kubenswrapper[4909]: I1002 18:49:04.070472 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-z6bdb"] Oct 02 18:49:05 crc kubenswrapper[4909]: I1002 18:49:05.627776 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191d5a8e-d6cd-4538-9dde-2493ed061b25" path="/var/lib/kubelet/pods/191d5a8e-d6cd-4538-9dde-2493ed061b25/volumes" Oct 02 18:49:05 crc kubenswrapper[4909]: I1002 18:49:05.628384 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2616ad31-8b46-48ca-a3db-8e4f54596ae6" path="/var/lib/kubelet/pods/2616ad31-8b46-48ca-a3db-8e4f54596ae6/volumes" Oct 02 18:49:16 crc kubenswrapper[4909]: I1002 18:49:16.609322 4909 scope.go:117] 
"RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:49:16 crc kubenswrapper[4909]: E1002 18:49:16.610583 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.243553 4909 scope.go:117] "RemoveContainer" containerID="95bfc97d48c42ed508903f2ed8edb44398bec17e6f67a09adbd85c2c940b2b7b" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.288582 4909 scope.go:117] "RemoveContainer" containerID="0604c2babd8d145784807c52844338898fff93fc3dd5c7547f85a4641156a251" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.371865 4909 scope.go:117] "RemoveContainer" containerID="495a8408b458b80df6bc9eeb217efd129e4ab1e7f081a946ed717dce3570afe6" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.411677 4909 scope.go:117] "RemoveContainer" containerID="69dde16e06e8bb96685e09b5e31cf359492b6ea95a0e979552adcadfbce9b2b9" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.459343 4909 scope.go:117] "RemoveContainer" containerID="1b20789f7f0f58cf5f2d9b820d017318853573b3a27353e775066e5665ebe140" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.512518 4909 scope.go:117] "RemoveContainer" containerID="8a8e6350ee663fd86448714a6ffb70f643799b988372537e36489ec6a987c4ea" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.568348 4909 scope.go:117] "RemoveContainer" containerID="b6a1d19b9e9b07e6d125ce702b321533ea946c61ba2d7a8c025ff5343d287de4" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.609153 4909 scope.go:117] "RemoveContainer" 
containerID="9cbe2e21eeaf63627a116a9e79a150cf7f47ce05a41196f3e8b5dad8d39d30f6" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.644511 4909 scope.go:117] "RemoveContainer" containerID="7176712659b2dbe104d4276a94c5f15329da91bdd2a81e5c0998c73fbac5a4fb" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.674047 4909 scope.go:117] "RemoveContainer" containerID="3ed746a889a3acf435b0fc46a4e71b8bb2db04afd25d11a2c36fff1bd4270970" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.720764 4909 scope.go:117] "RemoveContainer" containerID="d9565192d447669b3876c887730b5767a1d3e7af2bab5bab1cc7be7e2cb5b27f" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.758635 4909 scope.go:117] "RemoveContainer" containerID="ffad82f390602ea39906e575b6f7751d3cdf83e940f8dc88b2189457bb9f295f" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.791096 4909 scope.go:117] "RemoveContainer" containerID="f51ffe3c0aed78cee8fcf6f09387b423ed2ab81c715f72dd09bcbf02c5606446" Oct 02 18:49:20 crc kubenswrapper[4909]: I1002 18:49:20.818520 4909 scope.go:117] "RemoveContainer" containerID="cd8cdd4c042b48dde2686b51942acd02e64bacdd9a2e8fbb69cd0e49d9728949" Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.048203 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-3859-account-create-wvjxj"] Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.064801 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d8c6-account-create-582jx"] Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.080387 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d8c6-account-create-582jx"] Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.092434 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-3859-account-create-wvjxj"] Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.625492 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191bbde0-1bc0-418f-ab35-5abe18caa0b1" 
path="/var/lib/kubelet/pods/191bbde0-1bc0-418f-ab35-5abe18caa0b1/volumes" Oct 02 18:49:21 crc kubenswrapper[4909]: I1002 18:49:21.626758 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e7fb0ec-f6be-437c-a9c6-98e50de69047" path="/var/lib/kubelet/pods/5e7fb0ec-f6be-437c-a9c6-98e50de69047/volumes" Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.032091 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0c9a-account-create-g2zs4"] Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.043284 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1af5-account-create-jw8kh"] Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.053376 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1af5-account-create-jw8kh"] Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.061603 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0c9a-account-create-g2zs4"] Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.623656 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b659a4c7-1d4e-432b-b759-b446761f985c" path="/var/lib/kubelet/pods/b659a4c7-1d4e-432b-b759-b446761f985c/volumes" Oct 02 18:49:23 crc kubenswrapper[4909]: I1002 18:49:23.625385 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d953baa8-f150-45e6-9949-766017b28d8d" path="/var/lib/kubelet/pods/d953baa8-f150-45e6-9949-766017b28d8d/volumes" Oct 02 18:49:24 crc kubenswrapper[4909]: I1002 18:49:24.050634 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cb4f8"] Oct 02 18:49:24 crc kubenswrapper[4909]: I1002 18:49:24.061220 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cb4f8"] Oct 02 18:49:25 crc kubenswrapper[4909]: I1002 18:49:25.627263 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9483cd24-6cfe-48cb-a812-8059a42cb41e" 
path="/var/lib/kubelet/pods/9483cd24-6cfe-48cb-a812-8059a42cb41e/volumes" Oct 02 18:49:29 crc kubenswrapper[4909]: I1002 18:49:29.614361 4909 scope.go:117] "RemoveContainer" containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:49:30 crc kubenswrapper[4909]: I1002 18:49:30.127642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573"} Oct 02 18:49:40 crc kubenswrapper[4909]: I1002 18:49:40.045697 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-m25jj"] Oct 02 18:49:40 crc kubenswrapper[4909]: I1002 18:49:40.055929 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m25jj"] Oct 02 18:49:41 crc kubenswrapper[4909]: I1002 18:49:41.637943 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2671433b-366d-48ec-90d4-28a4ed3cece9" path="/var/lib/kubelet/pods/2671433b-366d-48ec-90d4-28a4ed3cece9/volumes" Oct 02 18:49:47 crc kubenswrapper[4909]: I1002 18:49:47.322603 4909 generic.go:334] "Generic (PLEG): container finished" podID="df963eb6-4e59-4dc7-ab3f-dbb8459276a2" containerID="e848759f97d2b6a92addc5dee4821498af6061fb632a26e5e5393c1b9dcfa374" exitCode=0 Oct 02 18:49:47 crc kubenswrapper[4909]: I1002 18:49:47.322719 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" event={"ID":"df963eb6-4e59-4dc7-ab3f-dbb8459276a2","Type":"ContainerDied","Data":"e848759f97d2b6a92addc5dee4821498af6061fb632a26e5e5393c1b9dcfa374"} Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.795047 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.800387 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory\") pod \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.800690 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key\") pod \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.800785 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdxp6\" (UniqueName: \"kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6\") pod \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\" (UID: \"df963eb6-4e59-4dc7-ab3f-dbb8459276a2\") " Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.807182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6" (OuterVolumeSpecName: "kube-api-access-bdxp6") pod "df963eb6-4e59-4dc7-ab3f-dbb8459276a2" (UID: "df963eb6-4e59-4dc7-ab3f-dbb8459276a2"). InnerVolumeSpecName "kube-api-access-bdxp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.834111 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory" (OuterVolumeSpecName: "inventory") pod "df963eb6-4e59-4dc7-ab3f-dbb8459276a2" (UID: "df963eb6-4e59-4dc7-ab3f-dbb8459276a2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.842805 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "df963eb6-4e59-4dc7-ab3f-dbb8459276a2" (UID: "df963eb6-4e59-4dc7-ab3f-dbb8459276a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.902697 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.902727 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdxp6\" (UniqueName: \"kubernetes.io/projected/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-kube-api-access-bdxp6\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:48 crc kubenswrapper[4909]: I1002 18:49:48.902737 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df963eb6-4e59-4dc7-ab3f-dbb8459276a2-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.361255 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" event={"ID":"df963eb6-4e59-4dc7-ab3f-dbb8459276a2","Type":"ContainerDied","Data":"ee18fa76b9be020b2309c17905cd4fa460577422ee9f6317f48823f8e80af8bb"} Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.361700 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee18fa76b9be020b2309c17905cd4fa460577422ee9f6317f48823f8e80af8bb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.361385 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.454990 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb"] Oct 02 18:49:49 crc kubenswrapper[4909]: E1002 18:49:49.455546 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df963eb6-4e59-4dc7-ab3f-dbb8459276a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.455568 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="df963eb6-4e59-4dc7-ab3f-dbb8459276a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.455823 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="df963eb6-4e59-4dc7-ab3f-dbb8459276a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.456705 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.460978 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.461216 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.461330 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.461436 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.463974 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb"] Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.620631 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.620802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.620972 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nh4w\" (UniqueName: \"kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.722904 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nh4w\" (UniqueName: \"kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.724285 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.724800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.729237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.730536 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.745664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nh4w\" (UniqueName: \"kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-slmwb\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:49 crc kubenswrapper[4909]: I1002 18:49:49.809675 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:50 crc kubenswrapper[4909]: I1002 18:49:50.374725 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb"] Oct 02 18:49:50 crc kubenswrapper[4909]: I1002 18:49:50.388803 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 18:49:51 crc kubenswrapper[4909]: I1002 18:49:51.397300 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" event={"ID":"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b","Type":"ContainerStarted","Data":"6f62e4484fed2f37efe6e943960688f58f15033ed320adca8f1760bd030b469f"} Oct 02 18:49:51 crc kubenswrapper[4909]: I1002 18:49:51.397373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" event={"ID":"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b","Type":"ContainerStarted","Data":"4f5fdfb87e6eed41b9d188f24aac3260bd291daee6976f3e7235a259649b0aa4"} Oct 02 18:49:51 crc kubenswrapper[4909]: I1002 18:49:51.416573 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" podStartSLOduration=1.953726879 podStartE2EDuration="2.416543721s" podCreationTimestamp="2025-10-02 18:49:49 +0000 UTC" firstStartedPulling="2025-10-02 18:49:50.38838292 +0000 UTC m=+1911.575878819" lastFinishedPulling="2025-10-02 18:49:50.851199802 +0000 UTC m=+1912.038695661" observedRunningTime="2025-10-02 18:49:51.414453576 +0000 UTC m=+1912.601949465" watchObservedRunningTime="2025-10-02 18:49:51.416543721 +0000 UTC m=+1912.604039610" Oct 02 18:49:54 crc kubenswrapper[4909]: I1002 18:49:54.043983 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-xldr9"] Oct 02 18:49:54 crc 
kubenswrapper[4909]: I1002 18:49:54.059184 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-xldr9"] Oct 02 18:49:55 crc kubenswrapper[4909]: I1002 18:49:55.623430 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20be3d30-b225-41d3-a951-790a9b22d89c" path="/var/lib/kubelet/pods/20be3d30-b225-41d3-a951-790a9b22d89c/volumes" Oct 02 18:49:57 crc kubenswrapper[4909]: I1002 18:49:57.476751 4909 generic.go:334] "Generic (PLEG): container finished" podID="d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" containerID="6f62e4484fed2f37efe6e943960688f58f15033ed320adca8f1760bd030b469f" exitCode=0 Oct 02 18:49:57 crc kubenswrapper[4909]: I1002 18:49:57.476869 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" event={"ID":"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b","Type":"ContainerDied","Data":"6f62e4484fed2f37efe6e943960688f58f15033ed320adca8f1760bd030b469f"} Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.009125 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.133816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory\") pod \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.133903 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key\") pod \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.133966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nh4w\" (UniqueName: \"kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w\") pod \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\" (UID: \"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b\") " Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.139201 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w" (OuterVolumeSpecName: "kube-api-access-4nh4w") pod "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" (UID: "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b"). InnerVolumeSpecName "kube-api-access-4nh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.195174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" (UID: "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.196164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory" (OuterVolumeSpecName: "inventory") pod "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" (UID: "d3b880f8-829a-4ad8-b7a0-146e92ea1a4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.237513 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.237550 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.237558 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nh4w\" (UniqueName: \"kubernetes.io/projected/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b-kube-api-access-4nh4w\") on node \"crc\" DevicePath \"\"" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.502583 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" event={"ID":"d3b880f8-829a-4ad8-b7a0-146e92ea1a4b","Type":"ContainerDied","Data":"4f5fdfb87e6eed41b9d188f24aac3260bd291daee6976f3e7235a259649b0aa4"} Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.502923 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f5fdfb87e6eed41b9d188f24aac3260bd291daee6976f3e7235a259649b0aa4" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.502655 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.581515 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2"] Oct 02 18:49:59 crc kubenswrapper[4909]: E1002 18:49:59.582040 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.582061 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.582295 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.583238 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.585581 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.585705 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.585708 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.591083 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.596871 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2"] Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.644089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.644237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.644368 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vk8c\" (UniqueName: \"kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.746553 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.746654 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.746789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vk8c\" (UniqueName: \"kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.752052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: 
\"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.763417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.783230 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vk8c\" (UniqueName: \"kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsnx2\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:49:59 crc kubenswrapper[4909]: I1002 18:49:59.943502 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:50:00 crc kubenswrapper[4909]: I1002 18:50:00.535930 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2"] Oct 02 18:50:01 crc kubenswrapper[4909]: I1002 18:50:01.528452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" event={"ID":"acdf279c-a962-420a-b005-b5a736105600","Type":"ContainerStarted","Data":"5c6dc98c80e4041821f1b6ef0350d30ccf53df97b1f1284dc701f6de61bbf109"} Oct 02 18:50:02 crc kubenswrapper[4909]: I1002 18:50:02.541847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" event={"ID":"acdf279c-a962-420a-b005-b5a736105600","Type":"ContainerStarted","Data":"00af8b52e14541e0304fd45dbe7b1527aa154a1fcd5166d39f0c4c73bfa507de"} Oct 02 18:50:02 crc kubenswrapper[4909]: I1002 18:50:02.580356 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" podStartSLOduration=2.793095009 podStartE2EDuration="3.580331609s" podCreationTimestamp="2025-10-02 18:49:59 +0000 UTC" firstStartedPulling="2025-10-02 18:50:00.551622506 +0000 UTC m=+1921.739118375" lastFinishedPulling="2025-10-02 18:50:01.338859076 +0000 UTC m=+1922.526354975" observedRunningTime="2025-10-02 18:50:02.555435012 +0000 UTC m=+1923.742930911" watchObservedRunningTime="2025-10-02 18:50:02.580331609 +0000 UTC m=+1923.767827478" Oct 02 18:50:10 crc kubenswrapper[4909]: I1002 18:50:10.045690 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zw6nz"] Oct 02 18:50:10 crc kubenswrapper[4909]: I1002 18:50:10.064568 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rcr26"] Oct 02 18:50:10 crc kubenswrapper[4909]: I1002 18:50:10.106894 4909 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zw6nz"] Oct 02 18:50:10 crc kubenswrapper[4909]: I1002 18:50:10.115439 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rcr26"] Oct 02 18:50:11 crc kubenswrapper[4909]: I1002 18:50:11.624626 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e106d88-ec64-496f-b8a4-ca8137353399" path="/var/lib/kubelet/pods/4e106d88-ec64-496f-b8a4-ca8137353399/volumes" Oct 02 18:50:11 crc kubenswrapper[4909]: I1002 18:50:11.625800 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf2d8c4-630f-4e52-86b5-d20381e90564" path="/var/lib/kubelet/pods/5bf2d8c4-630f-4e52-86b5-d20381e90564/volumes" Oct 02 18:50:18 crc kubenswrapper[4909]: I1002 18:50:18.040200 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-r4lmb"] Oct 02 18:50:18 crc kubenswrapper[4909]: I1002 18:50:18.060243 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-r4lmb"] Oct 02 18:50:19 crc kubenswrapper[4909]: I1002 18:50:19.663908 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af170421-1a33-4e1d-b1ef-83ac5388791c" path="/var/lib/kubelet/pods/af170421-1a33-4e1d-b1ef-83ac5388791c/volumes" Oct 02 18:50:20 crc kubenswrapper[4909]: I1002 18:50:20.067207 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-k7dmx"] Oct 02 18:50:20 crc kubenswrapper[4909]: I1002 18:50:20.078735 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-k7dmx"] Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.189685 4909 scope.go:117] "RemoveContainer" containerID="3ad6fdc0d4c617c974619894478fecf07fd6e1e0f7dcf6f29cfd95062fcf3313" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.236263 4909 scope.go:117] "RemoveContainer" containerID="b95876ca37579b618c5a7c91d05906697b9b6076ed23fc99d4edf4e68a661678" Oct 02 
18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.309406 4909 scope.go:117] "RemoveContainer" containerID="1f18c200e8b9e3da31fd7d087179b8e4d18b339d06a9a90d5d79bb49b765685e" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.356115 4909 scope.go:117] "RemoveContainer" containerID="de0e11e5ab4efe4d2925fe8200afb644af4140991c8ad1137253fa265bc7b920" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.412419 4909 scope.go:117] "RemoveContainer" containerID="f4d3924233efd779b395c1134fca2e63cb7d8252538b49009c7ae427c4b6b1a1" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.448389 4909 scope.go:117] "RemoveContainer" containerID="e471edfc1a0940cca9a1b1dba3c96eaa76570a69d08a1be101400119dbeb604f" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.491782 4909 scope.go:117] "RemoveContainer" containerID="a80ef33be5c65eafb205abcb4138156b35af41d143261d04494627702ecdb959" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.513498 4909 scope.go:117] "RemoveContainer" containerID="5bef52bd5b082c0b9ddafc5ca237ad00ef1cac67457008a8fd73676187d42abe" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.536676 4909 scope.go:117] "RemoveContainer" containerID="98c22f569ba7eb60269756e1a95dccde274abe4eccbc2eeadfe7366ba54c9e6c" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.563493 4909 scope.go:117] "RemoveContainer" containerID="873613d600e989b17372e2117a3734d79f828585e2e6ca21eec4daf69a22c2df" Oct 02 18:50:21 crc kubenswrapper[4909]: I1002 18:50:21.626775 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4584a9-c2b3-4d6e-976e-131b8b349d79" path="/var/lib/kubelet/pods/ce4584a9-c2b3-4d6e-976e-131b8b349d79/volumes" Oct 02 18:50:46 crc kubenswrapper[4909]: I1002 18:50:46.116537 4909 generic.go:334] "Generic (PLEG): container finished" podID="acdf279c-a962-420a-b005-b5a736105600" containerID="00af8b52e14541e0304fd45dbe7b1527aa154a1fcd5166d39f0c4c73bfa507de" exitCode=0 Oct 02 18:50:46 crc kubenswrapper[4909]: I1002 18:50:46.116952 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" event={"ID":"acdf279c-a962-420a-b005-b5a736105600","Type":"ContainerDied","Data":"00af8b52e14541e0304fd45dbe7b1527aa154a1fcd5166d39f0c4c73bfa507de"} Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.672902 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.765615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key\") pod \"acdf279c-a962-420a-b005-b5a736105600\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.765717 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory\") pod \"acdf279c-a962-420a-b005-b5a736105600\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.765772 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vk8c\" (UniqueName: \"kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c\") pod \"acdf279c-a962-420a-b005-b5a736105600\" (UID: \"acdf279c-a962-420a-b005-b5a736105600\") " Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.771635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c" (OuterVolumeSpecName: "kube-api-access-6vk8c") pod "acdf279c-a962-420a-b005-b5a736105600" (UID: "acdf279c-a962-420a-b005-b5a736105600"). InnerVolumeSpecName "kube-api-access-6vk8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.794222 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acdf279c-a962-420a-b005-b5a736105600" (UID: "acdf279c-a962-420a-b005-b5a736105600"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.803362 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory" (OuterVolumeSpecName: "inventory") pod "acdf279c-a962-420a-b005-b5a736105600" (UID: "acdf279c-a962-420a-b005-b5a736105600"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.867969 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vk8c\" (UniqueName: \"kubernetes.io/projected/acdf279c-a962-420a-b005-b5a736105600-kube-api-access-6vk8c\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.867995 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:47 crc kubenswrapper[4909]: I1002 18:50:47.868004 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acdf279c-a962-420a-b005-b5a736105600-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.140687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" 
event={"ID":"acdf279c-a962-420a-b005-b5a736105600","Type":"ContainerDied","Data":"5c6dc98c80e4041821f1b6ef0350d30ccf53df97b1f1284dc701f6de61bbf109"} Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.140735 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.140743 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6dc98c80e4041821f1b6ef0350d30ccf53df97b1f1284dc701f6de61bbf109" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.249356 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5"] Oct 02 18:50:48 crc kubenswrapper[4909]: E1002 18:50:48.249940 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdf279c-a962-420a-b005-b5a736105600" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.249966 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdf279c-a962-420a-b005-b5a736105600" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.250279 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdf279c-a962-420a-b005-b5a736105600" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.251238 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.254792 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.255008 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.255197 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.255350 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.260407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5"] Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.378319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.378683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6hv5\" (UniqueName: \"kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.378882 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.481120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.481522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.481673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6hv5\" (UniqueName: \"kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.485538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: 
\"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.487393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.505744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6hv5\" (UniqueName: \"kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:48 crc kubenswrapper[4909]: I1002 18:50:48.578004 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:49 crc kubenswrapper[4909]: I1002 18:50:49.129696 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5"] Oct 02 18:50:49 crc kubenswrapper[4909]: I1002 18:50:49.151224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" event={"ID":"4f8285c9-a626-4f86-813b-4e144be2b061","Type":"ContainerStarted","Data":"25282426d9137e9f68e421b225bcd7b0db4752254015b217f314eb41ec72103c"} Oct 02 18:50:50 crc kubenswrapper[4909]: I1002 18:50:50.164672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" event={"ID":"4f8285c9-a626-4f86-813b-4e144be2b061","Type":"ContainerStarted","Data":"f1233fe3c0905275bb9f9d380544c30bba0c3c395f5d3f86c0de77ccfd7c2243"} Oct 02 18:50:50 crc kubenswrapper[4909]: I1002 18:50:50.185495 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" podStartSLOduration=1.564116864 podStartE2EDuration="2.185413729s" podCreationTimestamp="2025-10-02 18:50:48 +0000 UTC" firstStartedPulling="2025-10-02 18:50:49.130301047 +0000 UTC m=+1970.317796906" lastFinishedPulling="2025-10-02 18:50:49.751597882 +0000 UTC m=+1970.939093771" observedRunningTime="2025-10-02 18:50:50.181065874 +0000 UTC m=+1971.368561733" watchObservedRunningTime="2025-10-02 18:50:50.185413729 +0000 UTC m=+1971.372909588" Oct 02 18:50:55 crc kubenswrapper[4909]: I1002 18:50:55.218320 4909 generic.go:334] "Generic (PLEG): container finished" podID="4f8285c9-a626-4f86-813b-4e144be2b061" containerID="f1233fe3c0905275bb9f9d380544c30bba0c3c395f5d3f86c0de77ccfd7c2243" exitCode=0 Oct 02 18:50:55 crc kubenswrapper[4909]: I1002 18:50:55.218428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" event={"ID":"4f8285c9-a626-4f86-813b-4e144be2b061","Type":"ContainerDied","Data":"f1233fe3c0905275bb9f9d380544c30bba0c3c395f5d3f86c0de77ccfd7c2243"} Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.772546 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.873306 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory\") pod \"4f8285c9-a626-4f86-813b-4e144be2b061\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.873414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6hv5\" (UniqueName: \"kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5\") pod \"4f8285c9-a626-4f86-813b-4e144be2b061\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.873555 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key\") pod \"4f8285c9-a626-4f86-813b-4e144be2b061\" (UID: \"4f8285c9-a626-4f86-813b-4e144be2b061\") " Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.881106 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5" (OuterVolumeSpecName: "kube-api-access-j6hv5") pod "4f8285c9-a626-4f86-813b-4e144be2b061" (UID: "4f8285c9-a626-4f86-813b-4e144be2b061"). InnerVolumeSpecName "kube-api-access-j6hv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.905361 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f8285c9-a626-4f86-813b-4e144be2b061" (UID: "4f8285c9-a626-4f86-813b-4e144be2b061"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.931699 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory" (OuterVolumeSpecName: "inventory") pod "4f8285c9-a626-4f86-813b-4e144be2b061" (UID: "4f8285c9-a626-4f86-813b-4e144be2b061"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.976309 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.976349 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6hv5\" (UniqueName: \"kubernetes.io/projected/4f8285c9-a626-4f86-813b-4e144be2b061-kube-api-access-j6hv5\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:56 crc kubenswrapper[4909]: I1002 18:50:56.976363 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f8285c9-a626-4f86-813b-4e144be2b061-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.243964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" 
event={"ID":"4f8285c9-a626-4f86-813b-4e144be2b061","Type":"ContainerDied","Data":"25282426d9137e9f68e421b225bcd7b0db4752254015b217f314eb41ec72103c"} Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.244021 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25282426d9137e9f68e421b225bcd7b0db4752254015b217f314eb41ec72103c" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.244609 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.354718 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725"] Oct 02 18:50:57 crc kubenswrapper[4909]: E1002 18:50:57.355344 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8285c9-a626-4f86-813b-4e144be2b061" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.355372 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8285c9-a626-4f86-813b-4e144be2b061" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.355764 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8285c9-a626-4f86-813b-4e144be2b061" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.357123 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.359906 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.360375 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.360660 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.366937 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725"] Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.407280 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.511734 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.511778 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j426j\" (UniqueName: \"kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.512352 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.614518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.614715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j426j\" (UniqueName: \"kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.614810 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.618110 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: 
\"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.621104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.647120 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j426j\" (UniqueName: \"kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qb725\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:57 crc kubenswrapper[4909]: I1002 18:50:57.719113 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:50:58 crc kubenswrapper[4909]: I1002 18:50:58.347982 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725"] Oct 02 18:50:59 crc kubenswrapper[4909]: I1002 18:50:59.275083 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" event={"ID":"542798b4-6ef4-45c9-b91e-d692cb757dae","Type":"ContainerStarted","Data":"44f1316a2a028c431a9377517d40c47ac031ea9620ef580224bd4b91a4dc3564"} Oct 02 18:50:59 crc kubenswrapper[4909]: I1002 18:50:59.275714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" event={"ID":"542798b4-6ef4-45c9-b91e-d692cb757dae","Type":"ContainerStarted","Data":"095dc6390935b9a2690f55ee2406e9192f5836b8dec2aa0f805a760196c7a731"} Oct 02 18:50:59 crc kubenswrapper[4909]: I1002 18:50:59.311267 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" podStartSLOduration=1.8821277950000002 podStartE2EDuration="2.311230186s" podCreationTimestamp="2025-10-02 18:50:57 +0000 UTC" firstStartedPulling="2025-10-02 18:50:58.345292425 +0000 UTC m=+1979.532788274" lastFinishedPulling="2025-10-02 18:50:58.774394766 +0000 UTC m=+1979.961890665" observedRunningTime="2025-10-02 18:50:59.30301022 +0000 UTC m=+1980.490506159" watchObservedRunningTime="2025-10-02 18:50:59.311230186 +0000 UTC m=+1980.498726075" Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.055538 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zcxfk"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.076139 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8c5jp"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 
18:51:13.087310 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8c5jp"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.099900 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zcxfk"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.111939 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-r29fp"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.119763 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-r29fp"] Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.624608 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930a4d62-49c6-4a03-87d1-a91e08c5f01f" path="/var/lib/kubelet/pods/930a4d62-49c6-4a03-87d1-a91e08c5f01f/volumes" Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.625233 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41e0c4c-75a5-47e0-8e07-4b851ff9feda" path="/var/lib/kubelet/pods/a41e0c4c-75a5-47e0-8e07-4b851ff9feda/volumes" Oct 02 18:51:13 crc kubenswrapper[4909]: I1002 18:51:13.625729 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a1121e-8850-4ae8-9f23-058732d8cf96" path="/var/lib/kubelet/pods/c4a1121e-8850-4ae8-9f23-058732d8cf96/volumes" Oct 02 18:51:21 crc kubenswrapper[4909]: I1002 18:51:21.830777 4909 scope.go:117] "RemoveContainer" containerID="c3bcdcd1d51962bce660a6cbcabd71ff46ae088232d092070aafb299e543cba2" Oct 02 18:51:21 crc kubenswrapper[4909]: I1002 18:51:21.874933 4909 scope.go:117] "RemoveContainer" containerID="d9afede484450c6c68a142bc799051644f2f72f742b4d07ed72f4d21f66d438b" Oct 02 18:51:21 crc kubenswrapper[4909]: I1002 18:51:21.915666 4909 scope.go:117] "RemoveContainer" containerID="a1705cbe5e754667c3244841ddd57de8b2706814894832bcff4e971b74b672c7" Oct 02 18:51:21 crc kubenswrapper[4909]: I1002 18:51:21.977783 4909 scope.go:117] "RemoveContainer" 
containerID="53aeb06a6630555f39a15cbb85b64809fcc92f946ae25b0709a9a8c2a9776ce3" Oct 02 18:51:22 crc kubenswrapper[4909]: I1002 18:51:22.042850 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-dcf5-account-create-bt9p8"] Oct 02 18:51:22 crc kubenswrapper[4909]: I1002 18:51:22.060547 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-dcf5-account-create-bt9p8"] Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.031319 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5e90-account-create-4mc5m"] Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.041918 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-752e-account-create-xqbw8"] Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.050552 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5e90-account-create-4mc5m"] Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.058457 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-752e-account-create-xqbw8"] Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.625228 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a27b3d-9b2f-4734-988b-e72baa85082d" path="/var/lib/kubelet/pods/37a27b3d-9b2f-4734-988b-e72baa85082d/volumes" Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.626424 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6f61e1-d6fa-438d-9382-d9548af74f3f" path="/var/lib/kubelet/pods/3b6f61e1-d6fa-438d-9382-d9548af74f3f/volumes" Oct 02 18:51:23 crc kubenswrapper[4909]: I1002 18:51:23.627690 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7ec535-cc21-449a-88f0-efdf67c0119f" path="/var/lib/kubelet/pods/bd7ec535-cc21-449a-88f0-efdf67c0119f/volumes" Oct 02 18:51:46 crc kubenswrapper[4909]: I1002 18:51:46.045078 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-db-create-9gdjk"] Oct 02 18:51:46 crc kubenswrapper[4909]: I1002 18:51:46.061068 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-9gdjk"] Oct 02 18:51:47 crc kubenswrapper[4909]: I1002 18:51:47.626992 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8967b4-4b6b-4de7-bd0f-29017cc79ca9" path="/var/lib/kubelet/pods/db8967b4-4b6b-4de7-bd0f-29017cc79ca9/volumes" Oct 02 18:51:50 crc kubenswrapper[4909]: I1002 18:51:50.063887 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tjnr"] Oct 02 18:51:50 crc kubenswrapper[4909]: I1002 18:51:50.077326 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tjnr"] Oct 02 18:51:51 crc kubenswrapper[4909]: I1002 18:51:51.623509 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c9f259-4615-48e3-9c0c-89c320e100e0" path="/var/lib/kubelet/pods/68c9f259-4615-48e3-9c0c-89c320e100e0/volumes" Oct 02 18:51:53 crc kubenswrapper[4909]: I1002 18:51:53.054110 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:51:53 crc kubenswrapper[4909]: I1002 18:51:53.054490 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:51:58 crc kubenswrapper[4909]: I1002 18:51:58.047008 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-d265-account-create-nxdxn"] Oct 02 18:51:58 crc kubenswrapper[4909]: I1002 
18:51:58.060558 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-d265-account-create-nxdxn"] Oct 02 18:51:59 crc kubenswrapper[4909]: I1002 18:51:59.045351 4909 generic.go:334] "Generic (PLEG): container finished" podID="542798b4-6ef4-45c9-b91e-d692cb757dae" containerID="44f1316a2a028c431a9377517d40c47ac031ea9620ef580224bd4b91a4dc3564" exitCode=2 Oct 02 18:51:59 crc kubenswrapper[4909]: I1002 18:51:59.045459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" event={"ID":"542798b4-6ef4-45c9-b91e-d692cb757dae","Type":"ContainerDied","Data":"44f1316a2a028c431a9377517d40c47ac031ea9620ef580224bd4b91a4dc3564"} Oct 02 18:51:59 crc kubenswrapper[4909]: I1002 18:51:59.634650 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d54c6c8-3bf0-462b-b69a-fd242e123ae5" path="/var/lib/kubelet/pods/9d54c6c8-3bf0-462b-b69a-fd242e123ae5/volumes" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.626038 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.724171 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory\") pod \"542798b4-6ef4-45c9-b91e-d692cb757dae\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.724281 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j426j\" (UniqueName: \"kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j\") pod \"542798b4-6ef4-45c9-b91e-d692cb757dae\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.724411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key\") pod \"542798b4-6ef4-45c9-b91e-d692cb757dae\" (UID: \"542798b4-6ef4-45c9-b91e-d692cb757dae\") " Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.729929 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j" (OuterVolumeSpecName: "kube-api-access-j426j") pod "542798b4-6ef4-45c9-b91e-d692cb757dae" (UID: "542798b4-6ef4-45c9-b91e-d692cb757dae"). InnerVolumeSpecName "kube-api-access-j426j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.775614 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "542798b4-6ef4-45c9-b91e-d692cb757dae" (UID: "542798b4-6ef4-45c9-b91e-d692cb757dae"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.779114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory" (OuterVolumeSpecName: "inventory") pod "542798b4-6ef4-45c9-b91e-d692cb757dae" (UID: "542798b4-6ef4-45c9-b91e-d692cb757dae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.827177 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.827389 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j426j\" (UniqueName: \"kubernetes.io/projected/542798b4-6ef4-45c9-b91e-d692cb757dae-kube-api-access-j426j\") on node \"crc\" DevicePath \"\"" Oct 02 18:52:00 crc kubenswrapper[4909]: I1002 18:52:00.827480 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/542798b4-6ef4-45c9-b91e-d692cb757dae-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:52:01 crc kubenswrapper[4909]: I1002 18:52:01.070005 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" event={"ID":"542798b4-6ef4-45c9-b91e-d692cb757dae","Type":"ContainerDied","Data":"095dc6390935b9a2690f55ee2406e9192f5836b8dec2aa0f805a760196c7a731"} Oct 02 18:52:01 crc kubenswrapper[4909]: I1002 18:52:01.070102 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095dc6390935b9a2690f55ee2406e9192f5836b8dec2aa0f805a760196c7a731" Oct 02 18:52:01 crc kubenswrapper[4909]: I1002 18:52:01.070156 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.048622 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr"] Oct 02 18:52:08 crc kubenswrapper[4909]: E1002 18:52:08.050233 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542798b4-6ef4-45c9-b91e-d692cb757dae" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.050299 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="542798b4-6ef4-45c9-b91e-d692cb757dae" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.051160 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="542798b4-6ef4-45c9-b91e-d692cb757dae" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.053441 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.060500 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.060506 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.065700 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.066656 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.073611 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr"] Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.219491 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.219543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.219692 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7jp\" (UniqueName: \"kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.322098 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7jp\" (UniqueName: \"kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.322372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.322408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.331058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: 
\"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.331450 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.353350 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7jp\" (UniqueName: \"kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-94xbr\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.387829 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:52:08 crc kubenswrapper[4909]: I1002 18:52:08.975299 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr"] Oct 02 18:52:08 crc kubenswrapper[4909]: W1002 18:52:08.976650 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdd9d4a_c13e_4dec_9808_07d9dd4b12ed.slice/crio-299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8 WatchSource:0}: Error finding container 299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8: Status 404 returned error can't find the container with id 299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8 Oct 02 18:52:09 crc kubenswrapper[4909]: I1002 18:52:09.174508 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" event={"ID":"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed","Type":"ContainerStarted","Data":"299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8"} Oct 02 18:52:10 crc kubenswrapper[4909]: I1002 18:52:10.194324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" event={"ID":"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed","Type":"ContainerStarted","Data":"3b376c0364942d95e88d49b66642f6d0b1664c36b59c4ccfe9c72b5e87a948dd"} Oct 02 18:52:10 crc kubenswrapper[4909]: I1002 18:52:10.225243 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" podStartSLOduration=1.748466136 podStartE2EDuration="2.225210823s" podCreationTimestamp="2025-10-02 18:52:08 +0000 UTC" firstStartedPulling="2025-10-02 18:52:08.979689732 +0000 UTC m=+2050.167185591" lastFinishedPulling="2025-10-02 18:52:09.456434419 +0000 UTC m=+2050.643930278" 
observedRunningTime="2025-10-02 18:52:10.223318044 +0000 UTC m=+2051.410813943" watchObservedRunningTime="2025-10-02 18:52:10.225210823 +0000 UTC m=+2051.412706732" Oct 02 18:52:15 crc kubenswrapper[4909]: I1002 18:52:15.033217 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlfn6"] Oct 02 18:52:15 crc kubenswrapper[4909]: I1002 18:52:15.040005 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlfn6"] Oct 02 18:52:15 crc kubenswrapper[4909]: I1002 18:52:15.628938 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57665172-f334-4a30-b3d0-60b0a50183e9" path="/var/lib/kubelet/pods/57665172-f334-4a30-b3d0-60b0a50183e9/volumes" Oct 02 18:52:20 crc kubenswrapper[4909]: I1002 18:52:20.079088 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vv8d"] Oct 02 18:52:20 crc kubenswrapper[4909]: I1002 18:52:20.094267 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vv8d"] Oct 02 18:52:21 crc kubenswrapper[4909]: I1002 18:52:21.622251 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3230517-79be-47ce-83ca-f423e167c5f1" path="/var/lib/kubelet/pods/e3230517-79be-47ce-83ca-f423e167c5f1/volumes" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.110471 4909 scope.go:117] "RemoveContainer" containerID="c192334de91920dea5030d7f865209eacf5b8e42b49b13ee26662b61c15d9117" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.164345 4909 scope.go:117] "RemoveContainer" containerID="c87168b1cf9ab041982b89023cd2d10690bb35dbb16e9436113a665069e2e1f5" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.191638 4909 scope.go:117] "RemoveContainer" containerID="268d53a819a1f6e76c1c0686344787a16b60cb3419710ead08d8f36cf7ee8793" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.253545 4909 scope.go:117] "RemoveContainer" 
containerID="933b48d8afb9e3d36873eedf6cf8456da8b6c05dbda6c1d9c7f32c4cf3344cea" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.297236 4909 scope.go:117] "RemoveContainer" containerID="2c9f9515b5d2294949cef5236454c43f368efcf326b1c5b3406e0771f524ec49" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.362913 4909 scope.go:117] "RemoveContainer" containerID="040c561f2c96fa18b46a5a71e966700017bf9ab5381acb6a5173564b27c33506" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.393123 4909 scope.go:117] "RemoveContainer" containerID="465296b3e3c237d3e9f2a6061f2ea5c13459c17911c91bd98391c2cc51d3789a" Oct 02 18:52:22 crc kubenswrapper[4909]: I1002 18:52:22.412431 4909 scope.go:117] "RemoveContainer" containerID="776b21505c4ed56155dc4d3af85f15489de7333b08067fbc91be6f7cbf0da4f1" Oct 02 18:52:23 crc kubenswrapper[4909]: I1002 18:52:23.059421 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:52:23 crc kubenswrapper[4909]: I1002 18:52:23.061106 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.141233 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.144616 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.156368 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.187221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gdg\" (UniqueName: \"kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.187699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.187941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.290282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.290704 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9gdg\" (UniqueName: \"kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.290883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.291303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.291348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.309239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gdg\" (UniqueName: \"kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg\") pod \"community-operators-t9ww5\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:49 crc kubenswrapper[4909]: I1002 18:52:49.476394 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:50 crc kubenswrapper[4909]: I1002 18:52:50.016241 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:52:50 crc kubenswrapper[4909]: I1002 18:52:50.715154 4909 generic.go:334] "Generic (PLEG): container finished" podID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerID="5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d" exitCode=0 Oct 02 18:52:50 crc kubenswrapper[4909]: I1002 18:52:50.715603 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerDied","Data":"5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d"} Oct 02 18:52:50 crc kubenswrapper[4909]: I1002 18:52:50.716174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerStarted","Data":"af8b1abeaf79481bfa65bdcc7a2c36ceeb05f1d51c89bf0a1bd9f9622644c5c7"} Oct 02 18:52:52 crc kubenswrapper[4909]: I1002 18:52:52.735180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerStarted","Data":"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac"} Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.054550 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.054605 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.054649 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.055400 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.055459 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573" gracePeriod=600 Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.748323 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573" exitCode=0 Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.748451 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573"} Oct 02 18:52:53 crc kubenswrapper[4909]: I1002 18:52:53.748703 4909 scope.go:117] "RemoveContainer" 
containerID="8d2bfeedfae96d267362c5c9e26fdf3c2e1c91c831940186facbc9cafe836142" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.516170 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.519314 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.528271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.528335 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.528725 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jpg4\" (UniqueName: \"kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.528876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.630417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.630551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jpg4\" (UniqueName: \"kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.630626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.631323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.631349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.666408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jpg4\" (UniqueName: 
\"kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4\") pod \"redhat-marketplace-9wm2f\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.783978 4909 generic.go:334] "Generic (PLEG): container finished" podID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerID="b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac" exitCode=0 Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.784161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerDied","Data":"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac"} Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.790169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172"} Oct 02 18:52:54 crc kubenswrapper[4909]: I1002 18:52:54.841366 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.178767 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.801818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerStarted","Data":"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202"} Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.804283 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerID="7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea" exitCode=0 Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.804394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerDied","Data":"7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea"} Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.804446 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerStarted","Data":"55dfbef8fe989084d305225fade4d86883e069e44c3e0aea778b9306f16a0ea0"} Oct 02 18:52:55 crc kubenswrapper[4909]: I1002 18:52:55.824681 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t9ww5" podStartSLOduration=2.354333128 podStartE2EDuration="6.8246648s" podCreationTimestamp="2025-10-02 18:52:49 +0000 UTC" firstStartedPulling="2025-10-02 18:52:50.71751473 +0000 UTC m=+2091.905010599" lastFinishedPulling="2025-10-02 18:52:55.187846422 +0000 UTC m=+2096.375342271" observedRunningTime="2025-10-02 18:52:55.821643266 +0000 
UTC m=+2097.009139135" watchObservedRunningTime="2025-10-02 18:52:55.8246648 +0000 UTC m=+2097.012160669" Oct 02 18:52:56 crc kubenswrapper[4909]: I1002 18:52:56.816480 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerStarted","Data":"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d"} Oct 02 18:52:57 crc kubenswrapper[4909]: I1002 18:52:57.843485 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerID="cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d" exitCode=0 Oct 02 18:52:57 crc kubenswrapper[4909]: I1002 18:52:57.843535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerDied","Data":"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d"} Oct 02 18:52:59 crc kubenswrapper[4909]: I1002 18:52:59.477388 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:59 crc kubenswrapper[4909]: I1002 18:52:59.478008 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:52:59 crc kubenswrapper[4909]: I1002 18:52:59.868010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerStarted","Data":"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf"} Oct 02 18:52:59 crc kubenswrapper[4909]: I1002 18:52:59.891384 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9wm2f" podStartSLOduration=3.139515491 podStartE2EDuration="5.891368005s" podCreationTimestamp="2025-10-02 
18:52:54 +0000 UTC" firstStartedPulling="2025-10-02 18:52:55.806590236 +0000 UTC m=+2096.994086105" lastFinishedPulling="2025-10-02 18:52:58.55844275 +0000 UTC m=+2099.745938619" observedRunningTime="2025-10-02 18:52:59.88836243 +0000 UTC m=+2101.075858299" watchObservedRunningTime="2025-10-02 18:52:59.891368005 +0000 UTC m=+2101.078863864" Oct 02 18:53:00 crc kubenswrapper[4909]: I1002 18:53:00.543154 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-t9ww5" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="registry-server" probeResult="failure" output=< Oct 02 18:53:00 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:53:00 crc kubenswrapper[4909]: > Oct 02 18:53:03 crc kubenswrapper[4909]: I1002 18:53:03.912407 4909 generic.go:334] "Generic (PLEG): container finished" podID="abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" containerID="3b376c0364942d95e88d49b66642f6d0b1664c36b59c4ccfe9c72b5e87a948dd" exitCode=0 Oct 02 18:53:03 crc kubenswrapper[4909]: I1002 18:53:03.912448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" event={"ID":"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed","Type":"ContainerDied","Data":"3b376c0364942d95e88d49b66642f6d0b1664c36b59c4ccfe9c72b5e87a948dd"} Oct 02 18:53:04 crc kubenswrapper[4909]: I1002 18:53:04.051978 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jl4jd"] Oct 02 18:53:04 crc kubenswrapper[4909]: I1002 18:53:04.066906 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jl4jd"] Oct 02 18:53:04 crc kubenswrapper[4909]: I1002 18:53:04.841660 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:04 crc kubenswrapper[4909]: I1002 18:53:04.841723 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:04 crc kubenswrapper[4909]: I1002 18:53:04.914080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.000699 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.173456 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.386492 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.490551 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r7jp\" (UniqueName: \"kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp\") pod \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.490672 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory\") pod \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.490726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key\") pod \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\" (UID: \"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed\") " Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.497426 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp" (OuterVolumeSpecName: "kube-api-access-4r7jp") pod "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" (UID: "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed"). InnerVolumeSpecName "kube-api-access-4r7jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.523987 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory" (OuterVolumeSpecName: "inventory") pod "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" (UID: "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.532991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" (UID: "abdd9d4a-c13e-4dec-9808-07d9dd4b12ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.593760 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r7jp\" (UniqueName: \"kubernetes.io/projected/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-kube-api-access-4r7jp\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.593803 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.593816 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.645600 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce99eed-6e71-4b26-9ea2-5f5cd06cda4e" path="/var/lib/kubelet/pods/bce99eed-6e71-4b26-9ea2-5f5cd06cda4e/volumes" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.937948 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.937941 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr" event={"ID":"abdd9d4a-c13e-4dec-9808-07d9dd4b12ed","Type":"ContainerDied","Data":"299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8"} Oct 02 18:53:05 crc kubenswrapper[4909]: I1002 18:53:05.939008 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299a515cca57b0bf40cb85a9f90b1ceabe8579185a6920698f4cc23c3fcb05a8" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.032248 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n9x86"] Oct 02 18:53:06 crc kubenswrapper[4909]: E1002 18:53:06.033229 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.033366 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.033797 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.035211 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.038666 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.038820 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.038874 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.039176 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.046141 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n9x86"] Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.207008 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvmd\" (UniqueName: \"kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.207509 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.207684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.309523 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvmd\" (UniqueName: \"kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.309627 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.309677 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.315287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: 
I1002 18:53:06.329320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.329929 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvmd\" (UniqueName: \"kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd\") pod \"ssh-known-hosts-edpm-deployment-n9x86\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.355018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.952108 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9wm2f" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="registry-server" containerID="cri-o://441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf" gracePeriod=2 Oct 02 18:53:06 crc kubenswrapper[4909]: I1002 18:53:06.965452 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n9x86"] Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.483774 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.539921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content\") pod \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.540011 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities\") pod \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.540091 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jpg4\" (UniqueName: \"kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4\") pod \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\" (UID: \"4b1f5c84-3099-46fd-8901-91ce9a9fced1\") " Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.540884 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities" (OuterVolumeSpecName: "utilities") pod "4b1f5c84-3099-46fd-8901-91ce9a9fced1" (UID: "4b1f5c84-3099-46fd-8901-91ce9a9fced1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.547225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4" (OuterVolumeSpecName: "kube-api-access-6jpg4") pod "4b1f5c84-3099-46fd-8901-91ce9a9fced1" (UID: "4b1f5c84-3099-46fd-8901-91ce9a9fced1"). InnerVolumeSpecName "kube-api-access-6jpg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.556996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b1f5c84-3099-46fd-8901-91ce9a9fced1" (UID: "4b1f5c84-3099-46fd-8901-91ce9a9fced1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.642724 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.642765 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1f5c84-3099-46fd-8901-91ce9a9fced1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.642786 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jpg4\" (UniqueName: \"kubernetes.io/projected/4b1f5c84-3099-46fd-8901-91ce9a9fced1-kube-api-access-6jpg4\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.968733 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerID="441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf" exitCode=0 Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.968791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerDied","Data":"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf"} Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.968873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-9wm2f" event={"ID":"4b1f5c84-3099-46fd-8901-91ce9a9fced1","Type":"ContainerDied","Data":"55dfbef8fe989084d305225fade4d86883e069e44c3e0aea778b9306f16a0ea0"} Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.968904 4909 scope.go:117] "RemoveContainer" containerID="441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.968963 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wm2f" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:07.973074 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" event={"ID":"38813735-9677-4533-a46e-f07e0fe43cdc","Type":"ContainerStarted","Data":"558c59bebfac33cbe8722cb75ac34b33a82816189716a758940ad27aefac15e9"} Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.011649 4909 scope.go:117] "RemoveContainer" containerID="cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.021293 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.039449 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wm2f"] Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.048380 4909 scope.go:117] "RemoveContainer" containerID="7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.090299 4909 scope.go:117] "RemoveContainer" containerID="441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf" Oct 02 18:53:08 crc kubenswrapper[4909]: E1002 18:53:08.091228 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf\": container with ID starting with 441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf not found: ID does not exist" containerID="441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.091272 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf"} err="failed to get container status \"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf\": rpc error: code = NotFound desc = could not find container \"441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf\": container with ID starting with 441a833e116194d1abe0520e2047d4a3bdf3e96e198d29c483c95785dd6324cf not found: ID does not exist" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.091302 4909 scope.go:117] "RemoveContainer" containerID="cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d" Oct 02 18:53:08 crc kubenswrapper[4909]: E1002 18:53:08.091845 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d\": container with ID starting with cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d not found: ID does not exist" containerID="cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.091903 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d"} err="failed to get container status \"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d\": rpc error: code = NotFound desc = could not find container \"cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d\": container with ID 
starting with cc8585dc47dd2505056f8b61834881ce1889db6f3ef9c671c14a94439ca55a2d not found: ID does not exist" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.091942 4909 scope.go:117] "RemoveContainer" containerID="7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea" Oct 02 18:53:08 crc kubenswrapper[4909]: E1002 18:53:08.092385 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea\": container with ID starting with 7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea not found: ID does not exist" containerID="7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.092410 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea"} err="failed to get container status \"7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea\": rpc error: code = NotFound desc = could not find container \"7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea\": container with ID starting with 7e4061b0946d1b9c5dd98ba0836acb8802070ab9290572ba7ce2111d7d8079ea not found: ID does not exist" Oct 02 18:53:08 crc kubenswrapper[4909]: I1002 18:53:08.986949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" event={"ID":"38813735-9677-4533-a46e-f07e0fe43cdc","Type":"ContainerStarted","Data":"27cf801efd2f22348cf144e13997d30068e84def0a48004f2064b0e60ad2da5d"} Oct 02 18:53:09 crc kubenswrapper[4909]: I1002 18:53:09.015571 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" podStartSLOduration=1.6188350950000001 podStartE2EDuration="3.015547349s" podCreationTimestamp="2025-10-02 18:53:06 +0000 UTC" 
firstStartedPulling="2025-10-02 18:53:06.990886614 +0000 UTC m=+2108.178382473" lastFinishedPulling="2025-10-02 18:53:08.387598868 +0000 UTC m=+2109.575094727" observedRunningTime="2025-10-02 18:53:09.009711468 +0000 UTC m=+2110.197207337" watchObservedRunningTime="2025-10-02 18:53:09.015547349 +0000 UTC m=+2110.203043218" Oct 02 18:53:09 crc kubenswrapper[4909]: I1002 18:53:09.549632 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:53:09 crc kubenswrapper[4909]: I1002 18:53:09.640247 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" path="/var/lib/kubelet/pods/4b1f5c84-3099-46fd-8901-91ce9a9fced1/volumes" Oct 02 18:53:09 crc kubenswrapper[4909]: I1002 18:53:09.641759 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:53:10 crc kubenswrapper[4909]: I1002 18:53:10.564730 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.015428 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t9ww5" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="registry-server" containerID="cri-o://0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202" gracePeriod=2 Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.725582 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.832055 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gdg\" (UniqueName: \"kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg\") pod \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.832128 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content\") pod \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.832506 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities\") pod \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\" (UID: \"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13\") " Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.833531 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities" (OuterVolumeSpecName: "utilities") pod "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" (UID: "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.838660 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg" (OuterVolumeSpecName: "kube-api-access-v9gdg") pod "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" (UID: "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13"). InnerVolumeSpecName "kube-api-access-v9gdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.899843 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" (UID: "79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.935191 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gdg\" (UniqueName: \"kubernetes.io/projected/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-kube-api-access-v9gdg\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.935325 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:11 crc kubenswrapper[4909]: I1002 18:53:11.935344 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.029393 4909 generic.go:334] "Generic (PLEG): container finished" podID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerID="0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202" exitCode=0 Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.029438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerDied","Data":"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202"} Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.029484 4909 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9ww5" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.029501 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9ww5" event={"ID":"79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13","Type":"ContainerDied","Data":"af8b1abeaf79481bfa65bdcc7a2c36ceeb05f1d51c89bf0a1bd9f9622644c5c7"} Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.029529 4909 scope.go:117] "RemoveContainer" containerID="0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.071416 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.072943 4909 scope.go:117] "RemoveContainer" containerID="b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.081324 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t9ww5"] Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.103815 4909 scope.go:117] "RemoveContainer" containerID="5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.145898 4909 scope.go:117] "RemoveContainer" containerID="0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202" Oct 02 18:53:12 crc kubenswrapper[4909]: E1002 18:53:12.146392 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202\": container with ID starting with 0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202 not found: ID does not exist" containerID="0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.146421 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202"} err="failed to get container status \"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202\": rpc error: code = NotFound desc = could not find container \"0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202\": container with ID starting with 0a457b30bc64bf914515884c9c07e5ed46bac6bc01c59ac8dac168184b959202 not found: ID does not exist" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.146444 4909 scope.go:117] "RemoveContainer" containerID="b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac" Oct 02 18:53:12 crc kubenswrapper[4909]: E1002 18:53:12.146790 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac\": container with ID starting with b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac not found: ID does not exist" containerID="b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.146836 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac"} err="failed to get container status \"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac\": rpc error: code = NotFound desc = could not find container \"b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac\": container with ID starting with b54f946b808c660f667afb7fa4b69a09b2215babbd479b4e771bd3604a9a79ac not found: ID does not exist" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.146872 4909 scope.go:117] "RemoveContainer" containerID="5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d" Oct 02 18:53:12 crc kubenswrapper[4909]: E1002 
18:53:12.147213 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d\": container with ID starting with 5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d not found: ID does not exist" containerID="5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d" Oct 02 18:53:12 crc kubenswrapper[4909]: I1002 18:53:12.147305 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d"} err="failed to get container status \"5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d\": rpc error: code = NotFound desc = could not find container \"5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d\": container with ID starting with 5d7a9836ee5455166b823249d480e955d071429f9a80a352801bdd7cd2e3b65d not found: ID does not exist" Oct 02 18:53:13 crc kubenswrapper[4909]: I1002 18:53:13.629329 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" path="/var/lib/kubelet/pods/79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13/volumes" Oct 02 18:53:17 crc kubenswrapper[4909]: I1002 18:53:17.087148 4909 generic.go:334] "Generic (PLEG): container finished" podID="38813735-9677-4533-a46e-f07e0fe43cdc" containerID="27cf801efd2f22348cf144e13997d30068e84def0a48004f2064b0e60ad2da5d" exitCode=0 Oct 02 18:53:17 crc kubenswrapper[4909]: I1002 18:53:17.087202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" event={"ID":"38813735-9677-4533-a46e-f07e0fe43cdc","Type":"ContainerDied","Data":"27cf801efd2f22348cf144e13997d30068e84def0a48004f2064b0e60ad2da5d"} Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.578911 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.680144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvmd\" (UniqueName: \"kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd\") pod \"38813735-9677-4533-a46e-f07e0fe43cdc\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.680736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0\") pod \"38813735-9677-4533-a46e-f07e0fe43cdc\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.680831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam\") pod \"38813735-9677-4533-a46e-f07e0fe43cdc\" (UID: \"38813735-9677-4533-a46e-f07e0fe43cdc\") " Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.686989 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd" (OuterVolumeSpecName: "kube-api-access-gzvmd") pod "38813735-9677-4533-a46e-f07e0fe43cdc" (UID: "38813735-9677-4533-a46e-f07e0fe43cdc"). InnerVolumeSpecName "kube-api-access-gzvmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.717530 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "38813735-9677-4533-a46e-f07e0fe43cdc" (UID: "38813735-9677-4533-a46e-f07e0fe43cdc"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.722457 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "38813735-9677-4533-a46e-f07e0fe43cdc" (UID: "38813735-9677-4533-a46e-f07e0fe43cdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.784218 4909 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.784270 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38813735-9677-4533-a46e-f07e0fe43cdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:18 crc kubenswrapper[4909]: I1002 18:53:18.784293 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvmd\" (UniqueName: \"kubernetes.io/projected/38813735-9677-4533-a46e-f07e0fe43cdc-kube-api-access-gzvmd\") on node \"crc\" DevicePath \"\"" Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.121696 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86" event={"ID":"38813735-9677-4533-a46e-f07e0fe43cdc","Type":"ContainerDied","Data":"558c59bebfac33cbe8722cb75ac34b33a82816189716a758940ad27aefac15e9"} Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.121843 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="558c59bebfac33cbe8722cb75ac34b33a82816189716a758940ad27aefac15e9" Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.121784 
4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n9x86"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232190 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"]
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232697 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38813735-9677-4533-a46e-f07e0fe43cdc" containerName="ssh-known-hosts-edpm-deployment"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232720 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="38813735-9677-4533-a46e-f07e0fe43cdc" containerName="ssh-known-hosts-edpm-deployment"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232736 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232744 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232758 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232766 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232789 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="extract-content"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232797 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="extract-content"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232817 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="extract-utilities"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232826 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="extract-utilities"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232842 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="extract-content"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232850 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="extract-content"
Oct 02 18:53:19 crc kubenswrapper[4909]: E1002 18:53:19.232892 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="extract-utilities"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.232902 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="extract-utilities"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.233295 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="38813735-9677-4533-a46e-f07e0fe43cdc" containerName="ssh-known-hosts-edpm-deployment"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.233330 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1f5c84-3099-46fd-8901-91ce9a9fced1" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.233363 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c201f8-0f55-4cfe-bf54-fe1d9e2bfe13" containerName="registry-server"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.234464 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.237764 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.241754 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.241848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.242392 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.252098 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"]
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.294581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.294733 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p29wq\" (UniqueName: \"kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.294803 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.396889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.397334 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p29wq\" (UniqueName: \"kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.397494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.400548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.402914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.418932 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p29wq\" (UniqueName: \"kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xmv8x\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:19 crc kubenswrapper[4909]: I1002 18:53:19.568447 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:20 crc kubenswrapper[4909]: I1002 18:53:20.191618 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"]
Oct 02 18:53:21 crc kubenswrapper[4909]: I1002 18:53:21.143334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x" event={"ID":"1a8580e1-88af-49f4-aef1-f703b4a72b62","Type":"ContainerStarted","Data":"3ac0ddb8e540d693fa0813f9bd7573815a73b00e548612fe4e8146966c3d1efc"}
Oct 02 18:53:22 crc kubenswrapper[4909]: I1002 18:53:22.154463 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x" event={"ID":"1a8580e1-88af-49f4-aef1-f703b4a72b62","Type":"ContainerStarted","Data":"c6037637875077f0f9852518f5a97ac57dcac3220d24ca93d3c4314812ea1a6e"}
Oct 02 18:53:22 crc kubenswrapper[4909]: I1002 18:53:22.181009 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x" podStartSLOduration=2.432138595 podStartE2EDuration="3.180983686s" podCreationTimestamp="2025-10-02 18:53:19 +0000 UTC" firstStartedPulling="2025-10-02 18:53:20.201703605 +0000 UTC m=+2121.389199494" lastFinishedPulling="2025-10-02 18:53:20.950548676 +0000 UTC m=+2122.138044585" observedRunningTime="2025-10-02 18:53:22.176041791 +0000 UTC m=+2123.363537680" watchObservedRunningTime="2025-10-02 18:53:22.180983686 +0000 UTC m=+2123.368479555"
Oct 02 18:53:22 crc kubenswrapper[4909]: I1002 18:53:22.653300 4909 scope.go:117] "RemoveContainer" containerID="fc554b813fc5457dfb32ffaf5674d6505d15b95445b6608809fadc3bd741baae"
Oct 02 18:53:30 crc kubenswrapper[4909]: I1002 18:53:30.246986 4909 generic.go:334] "Generic (PLEG): container finished" podID="1a8580e1-88af-49f4-aef1-f703b4a72b62" containerID="c6037637875077f0f9852518f5a97ac57dcac3220d24ca93d3c4314812ea1a6e" exitCode=0
Oct 02 18:53:30 crc kubenswrapper[4909]: I1002 18:53:30.247061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x" event={"ID":"1a8580e1-88af-49f4-aef1-f703b4a72b62","Type":"ContainerDied","Data":"c6037637875077f0f9852518f5a97ac57dcac3220d24ca93d3c4314812ea1a6e"}
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.789396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.877904 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key\") pod \"1a8580e1-88af-49f4-aef1-f703b4a72b62\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") "
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.878135 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory\") pod \"1a8580e1-88af-49f4-aef1-f703b4a72b62\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") "
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.878318 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p29wq\" (UniqueName: \"kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq\") pod \"1a8580e1-88af-49f4-aef1-f703b4a72b62\" (UID: \"1a8580e1-88af-49f4-aef1-f703b4a72b62\") "
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.885481 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq" (OuterVolumeSpecName: "kube-api-access-p29wq") pod "1a8580e1-88af-49f4-aef1-f703b4a72b62" (UID: "1a8580e1-88af-49f4-aef1-f703b4a72b62"). InnerVolumeSpecName "kube-api-access-p29wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.912983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory" (OuterVolumeSpecName: "inventory") pod "1a8580e1-88af-49f4-aef1-f703b4a72b62" (UID: "1a8580e1-88af-49f4-aef1-f703b4a72b62"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.936878 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a8580e1-88af-49f4-aef1-f703b4a72b62" (UID: "1a8580e1-88af-49f4-aef1-f703b4a72b62"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.981134 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p29wq\" (UniqueName: \"kubernetes.io/projected/1a8580e1-88af-49f4-aef1-f703b4a72b62-kube-api-access-p29wq\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.981171 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:31 crc kubenswrapper[4909]: I1002 18:53:31.981188 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a8580e1-88af-49f4-aef1-f703b4a72b62-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.281099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x" event={"ID":"1a8580e1-88af-49f4-aef1-f703b4a72b62","Type":"ContainerDied","Data":"3ac0ddb8e540d693fa0813f9bd7573815a73b00e548612fe4e8146966c3d1efc"}
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.281147 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac0ddb8e540d693fa0813f9bd7573815a73b00e548612fe4e8146966c3d1efc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.281209 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.431833 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"]
Oct 02 18:53:32 crc kubenswrapper[4909]: E1002 18:53:32.432581 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8580e1-88af-49f4-aef1-f703b4a72b62" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.432610 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8580e1-88af-49f4-aef1-f703b4a72b62" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.432929 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8580e1-88af-49f4-aef1-f703b4a72b62" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.433911 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.435819 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.437416 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.437446 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.437636 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.451378 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"]
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.491202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svd99\" (UniqueName: \"kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.491419 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.491494 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.593875 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.594023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.594242 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svd99\" (UniqueName: \"kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.597844 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.599124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.612654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svd99\" (UniqueName: \"kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:32 crc kubenswrapper[4909]: I1002 18:53:32.758273 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:33 crc kubenswrapper[4909]: I1002 18:53:33.304146 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"]
Oct 02 18:53:33 crc kubenswrapper[4909]: W1002 18:53:33.311863 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d4c5c95_6799_4fa2_a07e_589509ca4f32.slice/crio-f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d WatchSource:0}: Error finding container f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d: Status 404 returned error can't find the container with id f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d
Oct 02 18:53:34 crc kubenswrapper[4909]: I1002 18:53:34.315670 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc" event={"ID":"2d4c5c95-6799-4fa2-a07e-589509ca4f32","Type":"ContainerStarted","Data":"f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d"}
Oct 02 18:53:35 crc kubenswrapper[4909]: I1002 18:53:35.326680 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc" event={"ID":"2d4c5c95-6799-4fa2-a07e-589509ca4f32","Type":"ContainerStarted","Data":"63ef418178edc5628439be39c3c631404343258f5dac21d1e7510d2db85476b8"}
Oct 02 18:53:35 crc kubenswrapper[4909]: I1002 18:53:35.351448 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc" podStartSLOduration=2.550953018 podStartE2EDuration="3.351416548s" podCreationTimestamp="2025-10-02 18:53:32 +0000 UTC" firstStartedPulling="2025-10-02 18:53:33.315167001 +0000 UTC m=+2134.502662870" lastFinishedPulling="2025-10-02 18:53:34.115630541 +0000 UTC m=+2135.303126400" observedRunningTime="2025-10-02 18:53:35.350385466 +0000 UTC m=+2136.537881345" watchObservedRunningTime="2025-10-02 18:53:35.351416548 +0000 UTC m=+2136.538912437"
Oct 02 18:53:45 crc kubenswrapper[4909]: I1002 18:53:45.445410 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d4c5c95-6799-4fa2-a07e-589509ca4f32" containerID="63ef418178edc5628439be39c3c631404343258f5dac21d1e7510d2db85476b8" exitCode=0
Oct 02 18:53:45 crc kubenswrapper[4909]: I1002 18:53:45.445480 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc" event={"ID":"2d4c5c95-6799-4fa2-a07e-589509ca4f32","Type":"ContainerDied","Data":"63ef418178edc5628439be39c3c631404343258f5dac21d1e7510d2db85476b8"}
Oct 02 18:53:46 crc kubenswrapper[4909]: I1002 18:53:46.977420 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.119692 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory\") pod \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") "
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.119810 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svd99\" (UniqueName: \"kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99\") pod \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") "
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.120330 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key\") pod \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\" (UID: \"2d4c5c95-6799-4fa2-a07e-589509ca4f32\") "
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.124463 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99" (OuterVolumeSpecName: "kube-api-access-svd99") pod "2d4c5c95-6799-4fa2-a07e-589509ca4f32" (UID: "2d4c5c95-6799-4fa2-a07e-589509ca4f32"). InnerVolumeSpecName "kube-api-access-svd99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.147128 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory" (OuterVolumeSpecName: "inventory") pod "2d4c5c95-6799-4fa2-a07e-589509ca4f32" (UID: "2d4c5c95-6799-4fa2-a07e-589509ca4f32"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.151220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d4c5c95-6799-4fa2-a07e-589509ca4f32" (UID: "2d4c5c95-6799-4fa2-a07e-589509ca4f32"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.223525 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.223583 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svd99\" (UniqueName: \"kubernetes.io/projected/2d4c5c95-6799-4fa2-a07e-589509ca4f32-kube-api-access-svd99\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.223604 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c5c95-6799-4fa2-a07e-589509ca4f32-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.469835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc" event={"ID":"2d4c5c95-6799-4fa2-a07e-589509ca4f32","Type":"ContainerDied","Data":"f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d"}
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.470213 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27ada28c6b2f6f493460f1ff60f87bab9ecc619a93cd5fb07ed03512b2ec64d"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.469945 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.585509 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"]
Oct 02 18:53:47 crc kubenswrapper[4909]: E1002 18:53:47.585955 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d4c5c95-6799-4fa2-a07e-589509ca4f32" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.585980 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4c5c95-6799-4fa2-a07e-589509ca4f32" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.586263 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d4c5c95-6799-4fa2-a07e-589509ca4f32" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.587090 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592213 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592376 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592684 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592819 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.592826 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.593514 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.596737 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"]
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.744135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chkjz\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.744193 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.744331 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745812 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.745944 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.746109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.746185 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.746204 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.746254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"
Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847596 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chkjz\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: 
\"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847810 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.847831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc 
kubenswrapper[4909]: I1002 18:53:47.847852 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.854282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.854652 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.855532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.855570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.855914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.856087 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.856127 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.856625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.857399 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.857812 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.859276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.860308 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.873699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chkjz\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:47 crc kubenswrapper[4909]: I1002 18:53:47.909755 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:53:48 crc kubenswrapper[4909]: I1002 18:53:48.499071 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"] Oct 02 18:53:49 crc kubenswrapper[4909]: I1002 18:53:49.494614 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" event={"ID":"4c49ad32-8f28-4e88-8560-929da446dcdf","Type":"ContainerStarted","Data":"62e2688fe86e782c218cb6d6dca73b06e10fd1c0adcb9e1f7596944783d31de5"} Oct 02 18:53:50 crc kubenswrapper[4909]: I1002 18:53:50.515398 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" event={"ID":"4c49ad32-8f28-4e88-8560-929da446dcdf","Type":"ContainerStarted","Data":"d77689118487f1d0d5239c83f4293f549775fe36da20ba11f7cc9d070ca90592"} Oct 02 18:53:50 crc kubenswrapper[4909]: I1002 18:53:50.544789 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" podStartSLOduration=2.288600549 podStartE2EDuration="3.544769137s" 
podCreationTimestamp="2025-10-02 18:53:47 +0000 UTC" firstStartedPulling="2025-10-02 18:53:48.50858726 +0000 UTC m=+2149.696083119" lastFinishedPulling="2025-10-02 18:53:49.764755808 +0000 UTC m=+2150.952251707" observedRunningTime="2025-10-02 18:53:50.538497401 +0000 UTC m=+2151.725993280" watchObservedRunningTime="2025-10-02 18:53:50.544769137 +0000 UTC m=+2151.732265006" Oct 02 18:54:27 crc kubenswrapper[4909]: I1002 18:54:27.043193 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-hfpp7"] Oct 02 18:54:27 crc kubenswrapper[4909]: I1002 18:54:27.055132 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-hfpp7"] Oct 02 18:54:27 crc kubenswrapper[4909]: I1002 18:54:27.642455 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337abd66-4416-4bbb-9ee9-29e704fabd94" path="/var/lib/kubelet/pods/337abd66-4416-4bbb-9ee9-29e704fabd94/volumes" Oct 02 18:54:34 crc kubenswrapper[4909]: I1002 18:54:34.060690 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c49ad32-8f28-4e88-8560-929da446dcdf" containerID="d77689118487f1d0d5239c83f4293f549775fe36da20ba11f7cc9d070ca90592" exitCode=0 Oct 02 18:54:34 crc kubenswrapper[4909]: I1002 18:54:34.060815 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" event={"ID":"4c49ad32-8f28-4e88-8560-929da446dcdf","Type":"ContainerDied","Data":"d77689118487f1d0d5239c83f4293f549775fe36da20ba11f7cc9d070ca90592"} Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.603862 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795471 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795534 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle\") pod 
\"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795720 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795792 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795839 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795895 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.795932 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chkjz\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: 
\"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.796060 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.796111 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.796151 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key\") pod \"4c49ad32-8f28-4e88-8560-929da446dcdf\" (UID: \"4c49ad32-8f28-4e88-8560-929da446dcdf\") " Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.802333 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.804720 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.804785 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.804804 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.804799 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.805248 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.805343 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz" (OuterVolumeSpecName: "kube-api-access-chkjz") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "kube-api-access-chkjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.806971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.808462 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.810212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.815221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.831511 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.836649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory" (OuterVolumeSpecName: "inventory") pod "4c49ad32-8f28-4e88-8560-929da446dcdf" (UID: "4c49ad32-8f28-4e88-8560-929da446dcdf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899055 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899082 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899097 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chkjz\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-kube-api-access-chkjz\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899112 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899124 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899136 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899148 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899158 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899169 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899181 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899193 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899205 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c49ad32-8f28-4e88-8560-929da446dcdf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:54:35 crc kubenswrapper[4909]: I1002 18:54:35.899217 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c49ad32-8f28-4e88-8560-929da446dcdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") 
on node \"crc\" DevicePath \"\"" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.086136 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" event={"ID":"4c49ad32-8f28-4e88-8560-929da446dcdf","Type":"ContainerDied","Data":"62e2688fe86e782c218cb6d6dca73b06e10fd1c0adcb9e1f7596944783d31de5"} Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.086472 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e2688fe86e782c218cb6d6dca73b06e10fd1c0adcb9e1f7596944783d31de5" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.086233 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.244217 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd"] Oct 02 18:54:36 crc kubenswrapper[4909]: E1002 18:54:36.245316 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c49ad32-8f28-4e88-8560-929da446dcdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.245351 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c49ad32-8f28-4e88-8560-929da446dcdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.245688 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c49ad32-8f28-4e88-8560-929da446dcdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.246933 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.250704 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.250931 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.251245 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.251282 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.251619 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.286626 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd"] Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.305488 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.305631 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: 
\"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.305699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.305735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.305755 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4ck\" (UniqueName: \"kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.407050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.407260 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.407475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.407554 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.407598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4ck\" (UniqueName: \"kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.409354 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc 
kubenswrapper[4909]: I1002 18:54:36.412558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.412686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.415970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.443529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4ck\" (UniqueName: \"kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gg9gd\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:36 crc kubenswrapper[4909]: I1002 18:54:36.571531 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:54:37 crc kubenswrapper[4909]: I1002 18:54:37.159977 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd"] Oct 02 18:54:38 crc kubenswrapper[4909]: I1002 18:54:38.113212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" event={"ID":"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5","Type":"ContainerStarted","Data":"868dd6de192e3a9122db3d4166dbaf4f08a655a236b6e8b8e583d7f975bd658c"} Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.133184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" event={"ID":"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5","Type":"ContainerStarted","Data":"3328904eeb1e5fc4a29b42438c52fb135aa466dbbe400cf8f31105cd3a66a05e"} Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.158898 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" podStartSLOduration=2.109738945 podStartE2EDuration="3.158879859s" podCreationTimestamp="2025-10-02 18:54:36 +0000 UTC" firstStartedPulling="2025-10-02 18:54:37.156962835 +0000 UTC m=+2198.344458704" lastFinishedPulling="2025-10-02 18:54:38.206103749 +0000 UTC m=+2199.393599618" observedRunningTime="2025-10-02 18:54:39.152789879 +0000 UTC m=+2200.340285738" watchObservedRunningTime="2025-10-02 18:54:39.158879859 +0000 UTC m=+2200.346375718" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.397387 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.400631 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.471142 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.473598 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.473694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.473920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrkm\" (UniqueName: \"kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.575218 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrkm\" (UniqueName: \"kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.575387 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.575439 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.575916 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.576481 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.612878 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrkm\" (UniqueName: \"kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm\") pod \"certified-operators-q52sw\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:39 crc kubenswrapper[4909]: I1002 18:54:39.733409 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:40 crc kubenswrapper[4909]: W1002 18:54:40.308935 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8587be7c_e6e3_450c_976e_2b67cb91c352.slice/crio-7251816a30095ff0f080c34797c153be61c6add7804652f8ae071708abaeadf0 WatchSource:0}: Error finding container 7251816a30095ff0f080c34797c153be61c6add7804652f8ae071708abaeadf0: Status 404 returned error can't find the container with id 7251816a30095ff0f080c34797c153be61c6add7804652f8ae071708abaeadf0 Oct 02 18:54:40 crc kubenswrapper[4909]: I1002 18:54:40.314850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:54:41 crc kubenswrapper[4909]: I1002 18:54:41.159389 4909 generic.go:334] "Generic (PLEG): container finished" podID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerID="c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3" exitCode=0 Oct 02 18:54:41 crc kubenswrapper[4909]: I1002 18:54:41.159480 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerDied","Data":"c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3"} Oct 02 18:54:41 crc kubenswrapper[4909]: I1002 18:54:41.159780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerStarted","Data":"7251816a30095ff0f080c34797c153be61c6add7804652f8ae071708abaeadf0"} Oct 02 18:54:44 crc kubenswrapper[4909]: I1002 18:54:44.192762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" 
event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerStarted","Data":"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4"} Oct 02 18:54:49 crc kubenswrapper[4909]: I1002 18:54:49.269378 4909 generic.go:334] "Generic (PLEG): container finished" podID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerID="f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4" exitCode=0 Oct 02 18:54:49 crc kubenswrapper[4909]: I1002 18:54:49.269477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerDied","Data":"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4"} Oct 02 18:54:58 crc kubenswrapper[4909]: I1002 18:54:58.367348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerStarted","Data":"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82"} Oct 02 18:54:58 crc kubenswrapper[4909]: I1002 18:54:58.392290 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q52sw" podStartSLOduration=3.2199483620000002 podStartE2EDuration="19.392270875s" podCreationTimestamp="2025-10-02 18:54:39 +0000 UTC" firstStartedPulling="2025-10-02 18:54:41.163083976 +0000 UTC m=+2202.350579845" lastFinishedPulling="2025-10-02 18:54:57.335406509 +0000 UTC m=+2218.522902358" observedRunningTime="2025-10-02 18:54:58.391081088 +0000 UTC m=+2219.578576967" watchObservedRunningTime="2025-10-02 18:54:58.392270875 +0000 UTC m=+2219.579766744" Oct 02 18:54:59 crc kubenswrapper[4909]: I1002 18:54:59.733821 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:54:59 crc kubenswrapper[4909]: I1002 18:54:59.734998 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:55:00 crc kubenswrapper[4909]: I1002 18:55:00.784319 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-q52sw" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="registry-server" probeResult="failure" output=< Oct 02 18:55:00 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 18:55:00 crc kubenswrapper[4909]: > Oct 02 18:55:09 crc kubenswrapper[4909]: I1002 18:55:09.821380 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:55:09 crc kubenswrapper[4909]: I1002 18:55:09.920192 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:55:10 crc kubenswrapper[4909]: I1002 18:55:10.060957 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-h9bx9"] Oct 02 18:55:10 crc kubenswrapper[4909]: I1002 18:55:10.077818 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-h9bx9"] Oct 02 18:55:10 crc kubenswrapper[4909]: I1002 18:55:10.606132 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:55:11 crc kubenswrapper[4909]: I1002 18:55:11.526677 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q52sw" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="registry-server" containerID="cri-o://b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82" gracePeriod=2 Oct 02 18:55:11 crc kubenswrapper[4909]: I1002 18:55:11.628126 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cdb0d6-c8c2-4646-8350-f63892d098f5" path="/var/lib/kubelet/pods/52cdb0d6-c8c2-4646-8350-f63892d098f5/volumes" Oct 02 
18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.059131 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.130823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities\") pod \"8587be7c-e6e3-450c-976e-2b67cb91c352\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.131314 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khrkm\" (UniqueName: \"kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm\") pod \"8587be7c-e6e3-450c-976e-2b67cb91c352\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.131492 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content\") pod \"8587be7c-e6e3-450c-976e-2b67cb91c352\" (UID: \"8587be7c-e6e3-450c-976e-2b67cb91c352\") " Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.132457 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities" (OuterVolumeSpecName: "utilities") pod "8587be7c-e6e3-450c-976e-2b67cb91c352" (UID: "8587be7c-e6e3-450c-976e-2b67cb91c352"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.132735 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.141309 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm" (OuterVolumeSpecName: "kube-api-access-khrkm") pod "8587be7c-e6e3-450c-976e-2b67cb91c352" (UID: "8587be7c-e6e3-450c-976e-2b67cb91c352"). InnerVolumeSpecName "kube-api-access-khrkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.201764 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8587be7c-e6e3-450c-976e-2b67cb91c352" (UID: "8587be7c-e6e3-450c-976e-2b67cb91c352"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.234790 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khrkm\" (UniqueName: \"kubernetes.io/projected/8587be7c-e6e3-450c-976e-2b67cb91c352-kube-api-access-khrkm\") on node \"crc\" DevicePath \"\"" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.234822 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8587be7c-e6e3-450c-976e-2b67cb91c352-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.558478 4909 generic.go:334] "Generic (PLEG): container finished" podID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerID="b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82" exitCode=0 Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.558713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerDied","Data":"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82"} Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.558748 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q52sw" event={"ID":"8587be7c-e6e3-450c-976e-2b67cb91c352","Type":"ContainerDied","Data":"7251816a30095ff0f080c34797c153be61c6add7804652f8ae071708abaeadf0"} Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.558751 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q52sw" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.558771 4909 scope.go:117] "RemoveContainer" containerID="b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.604880 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.605378 4909 scope.go:117] "RemoveContainer" containerID="f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.616794 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q52sw"] Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.654503 4909 scope.go:117] "RemoveContainer" containerID="c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.725683 4909 scope.go:117] "RemoveContainer" containerID="b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82" Oct 02 18:55:12 crc kubenswrapper[4909]: E1002 18:55:12.726488 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82\": container with ID starting with b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82 not found: ID does not exist" containerID="b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.726586 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82"} err="failed to get container status \"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82\": rpc error: code = NotFound desc = could not find 
container \"b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82\": container with ID starting with b9fa736330dd265eef7ad2847fc2d562e501258e71dbb68434213079adfbeb82 not found: ID does not exist" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.726631 4909 scope.go:117] "RemoveContainer" containerID="f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4" Oct 02 18:55:12 crc kubenswrapper[4909]: E1002 18:55:12.727341 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4\": container with ID starting with f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4 not found: ID does not exist" containerID="f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.727422 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4"} err="failed to get container status \"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4\": rpc error: code = NotFound desc = could not find container \"f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4\": container with ID starting with f3826c19739adab23eecce8fed5524d22840d2f68d57ee1e03a1dbf82ee54eb4 not found: ID does not exist" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.727481 4909 scope.go:117] "RemoveContainer" containerID="c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3" Oct 02 18:55:12 crc kubenswrapper[4909]: E1002 18:55:12.728067 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3\": container with ID starting with c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3 not found: ID does 
not exist" containerID="c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3" Oct 02 18:55:12 crc kubenswrapper[4909]: I1002 18:55:12.728108 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3"} err="failed to get container status \"c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3\": rpc error: code = NotFound desc = could not find container \"c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3\": container with ID starting with c9cc7b7c8f70ba558f64eee50cfc77e0bdfa208625722df64a86c960e71da6d3 not found: ID does not exist" Oct 02 18:55:13 crc kubenswrapper[4909]: I1002 18:55:13.629021 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" path="/var/lib/kubelet/pods/8587be7c-e6e3-450c-976e-2b67cb91c352/volumes" Oct 02 18:55:22 crc kubenswrapper[4909]: I1002 18:55:22.842232 4909 scope.go:117] "RemoveContainer" containerID="7273aaf94f67fdc171eda0bb780fcd929bfe9ee12c8227aa61bb9a334adc6bef" Oct 02 18:55:22 crc kubenswrapper[4909]: I1002 18:55:22.884790 4909 scope.go:117] "RemoveContainer" containerID="06cc032cbcea5c5a595809e2f3dd4fc182f75930682766e85adc4866a41cafab" Oct 02 18:55:23 crc kubenswrapper[4909]: I1002 18:55:23.054740 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:55:23 crc kubenswrapper[4909]: I1002 18:55:23.054799 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 02 18:55:53 crc kubenswrapper[4909]: I1002 18:55:53.054588 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:55:53 crc kubenswrapper[4909]: I1002 18:55:53.056291 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:56:00 crc kubenswrapper[4909]: I1002 18:56:00.137137 4909 generic.go:334] "Generic (PLEG): container finished" podID="36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" containerID="3328904eeb1e5fc4a29b42438c52fb135aa466dbbe400cf8f31105cd3a66a05e" exitCode=0 Oct 02 18:56:00 crc kubenswrapper[4909]: I1002 18:56:00.137228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" event={"ID":"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5","Type":"ContainerDied","Data":"3328904eeb1e5fc4a29b42438c52fb135aa466dbbe400cf8f31105cd3a66a05e"} Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.780996 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.813590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle\") pod \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.813860 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0\") pod \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.813956 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w4ck\" (UniqueName: \"kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck\") pod \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.813995 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key\") pod \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.814081 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory\") pod \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\" (UID: \"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5\") " Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.821969 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck" (OuterVolumeSpecName: "kube-api-access-5w4ck") pod "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" (UID: "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5"). InnerVolumeSpecName "kube-api-access-5w4ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.821967 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" (UID: "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.846720 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" (UID: "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.852722 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" (UID: "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.855964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory" (OuterVolumeSpecName: "inventory") pod "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" (UID: "36c4281e-6d35-4eef-8ac9-a86f5cd35fc5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.916244 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w4ck\" (UniqueName: \"kubernetes.io/projected/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-kube-api-access-5w4ck\") on node \"crc\" DevicePath \"\"" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.916281 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.916291 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.916303 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 18:56:01 crc kubenswrapper[4909]: I1002 18:56:01.916313 4909 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.173050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" event={"ID":"36c4281e-6d35-4eef-8ac9-a86f5cd35fc5","Type":"ContainerDied","Data":"868dd6de192e3a9122db3d4166dbaf4f08a655a236b6e8b8e583d7f975bd658c"} Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.173097 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="868dd6de192e3a9122db3d4166dbaf4f08a655a236b6e8b8e583d7f975bd658c" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.173172 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.334991 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd"] Oct 02 18:56:02 crc kubenswrapper[4909]: E1002 18:56:02.335770 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="extract-content" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.335861 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="extract-content" Oct 02 18:56:02 crc kubenswrapper[4909]: E1002 18:56:02.335952 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="registry-server" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.336068 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="registry-server" Oct 02 18:56:02 crc kubenswrapper[4909]: E1002 18:56:02.336158 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.336225 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 18:56:02 crc kubenswrapper[4909]: E1002 18:56:02.336304 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="extract-utilities" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.336369 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="extract-utilities" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.336704 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8587be7c-e6e3-450c-976e-2b67cb91c352" containerName="registry-server" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.336794 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.337812 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.342494 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.342832 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.343447 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.344344 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.345219 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.363326 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd"] Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.428651 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.428721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.428757 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tz5d\" (UniqueName: \"kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.428824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.428885 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.530894 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.531411 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.531513 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.531660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tz5d\" (UniqueName: \"kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.531831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.535464 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.536540 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.539286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.541629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.548265 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tz5d\" (UniqueName: \"kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:02 crc kubenswrapper[4909]: I1002 18:56:02.662114 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 18:56:03 crc kubenswrapper[4909]: I1002 18:56:03.203543 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd"] Oct 02 18:56:03 crc kubenswrapper[4909]: I1002 18:56:03.212309 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 18:56:04 crc kubenswrapper[4909]: I1002 18:56:04.199853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" event={"ID":"5124cbe9-04cf-4218-b2f2-96c5286e873b","Type":"ContainerStarted","Data":"e25c9e65a883b72369e4b3459bcf8998ae45aca51bf838600a5f8c40af608d80"} Oct 02 18:56:04 crc kubenswrapper[4909]: I1002 18:56:04.200217 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" event={"ID":"5124cbe9-04cf-4218-b2f2-96c5286e873b","Type":"ContainerStarted","Data":"23b6bcd41e52ca395a19d619b7a1a1311b79d8a5a9819601fcec63c09d4a20ba"} Oct 02 18:56:04 crc kubenswrapper[4909]: I1002 18:56:04.232130 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" podStartSLOduration=1.7869870429999999 podStartE2EDuration="2.232102141s" podCreationTimestamp="2025-10-02 18:56:02 +0000 UTC" firstStartedPulling="2025-10-02 18:56:03.212057987 +0000 UTC m=+2284.399553856" lastFinishedPulling="2025-10-02 18:56:03.657173055 +0000 UTC m=+2284.844668954" observedRunningTime="2025-10-02 18:56:04.217779193 +0000 UTC m=+2285.405275062" watchObservedRunningTime="2025-10-02 18:56:04.232102141 +0000 UTC m=+2285.419598010" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.054807 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.055601 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.055672 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.056831 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.056918 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" gracePeriod=600 Oct 02 18:56:23 crc kubenswrapper[4909]: E1002 18:56:23.184286 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.466085 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" exitCode=0 Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.466202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172"} Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.466653 4909 scope.go:117] "RemoveContainer" containerID="a91abf9faef5ed8796c146ef7a0e1bf6166a28a8a51bea1801fb3fe8045d6573" Oct 02 18:56:23 crc kubenswrapper[4909]: I1002 18:56:23.468427 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:56:23 crc kubenswrapper[4909]: E1002 18:56:23.468993 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:56:35 crc kubenswrapper[4909]: I1002 18:56:35.609238 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:56:35 crc kubenswrapper[4909]: E1002 18:56:35.610054 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:56:47 crc kubenswrapper[4909]: I1002 18:56:47.609845 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:56:47 crc kubenswrapper[4909]: E1002 18:56:47.610722 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:56:58 crc kubenswrapper[4909]: I1002 18:56:58.609684 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:56:58 crc kubenswrapper[4909]: E1002 18:56:58.610862 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.313601 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.319058 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.327521 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.429052 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjs88\" (UniqueName: \"kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.429152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.429227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.531368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.531541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bjs88\" (UniqueName: \"kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.531630 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.532076 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.532250 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.556262 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjs88\" (UniqueName: \"kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88\") pod \"redhat-operators-vm5fw\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:09 crc kubenswrapper[4909]: I1002 18:57:09.693778 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:10 crc kubenswrapper[4909]: I1002 18:57:10.174146 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:11 crc kubenswrapper[4909]: I1002 18:57:11.120690 4909 generic.go:334] "Generic (PLEG): container finished" podID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerID="43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b" exitCode=0 Oct 02 18:57:11 crc kubenswrapper[4909]: I1002 18:57:11.120854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerDied","Data":"43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b"} Oct 02 18:57:11 crc kubenswrapper[4909]: I1002 18:57:11.122185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerStarted","Data":"b5fee522fe3e6b13f32f31820cc8a095deeaf764637e5225f2e34f9b9c99b13e"} Oct 02 18:57:11 crc kubenswrapper[4909]: I1002 18:57:11.611127 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:57:11 crc kubenswrapper[4909]: E1002 18:57:11.611742 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:57:13 crc kubenswrapper[4909]: I1002 18:57:13.150452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" 
event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerStarted","Data":"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5"} Oct 02 18:57:14 crc kubenswrapper[4909]: I1002 18:57:14.165534 4909 generic.go:334] "Generic (PLEG): container finished" podID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerID="880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5" exitCode=0 Oct 02 18:57:14 crc kubenswrapper[4909]: I1002 18:57:14.165597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerDied","Data":"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5"} Oct 02 18:57:15 crc kubenswrapper[4909]: I1002 18:57:15.178248 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerStarted","Data":"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174"} Oct 02 18:57:15 crc kubenswrapper[4909]: I1002 18:57:15.214567 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vm5fw" podStartSLOduration=2.779215574 podStartE2EDuration="6.214534679s" podCreationTimestamp="2025-10-02 18:57:09 +0000 UTC" firstStartedPulling="2025-10-02 18:57:11.122854892 +0000 UTC m=+2352.310350751" lastFinishedPulling="2025-10-02 18:57:14.558173997 +0000 UTC m=+2355.745669856" observedRunningTime="2025-10-02 18:57:15.205660871 +0000 UTC m=+2356.393156750" watchObservedRunningTime="2025-10-02 18:57:15.214534679 +0000 UTC m=+2356.402030578" Oct 02 18:57:19 crc kubenswrapper[4909]: I1002 18:57:19.694971 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:19 crc kubenswrapper[4909]: I1002 18:57:19.695719 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:19 crc kubenswrapper[4909]: I1002 18:57:19.765351 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:20 crc kubenswrapper[4909]: I1002 18:57:20.310505 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:20 crc kubenswrapper[4909]: I1002 18:57:20.375398 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.272484 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vm5fw" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="registry-server" containerID="cri-o://1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174" gracePeriod=2 Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.848522 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.957364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjs88\" (UniqueName: \"kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88\") pod \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.957478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities\") pod \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.957623 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content\") pod \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\" (UID: \"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe\") " Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.958937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities" (OuterVolumeSpecName: "utilities") pod "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" (UID: "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:57:22 crc kubenswrapper[4909]: I1002 18:57:22.966449 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88" (OuterVolumeSpecName: "kube-api-access-bjs88") pod "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" (UID: "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe"). InnerVolumeSpecName "kube-api-access-bjs88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.060198 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjs88\" (UniqueName: \"kubernetes.io/projected/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-kube-api-access-bjs88\") on node \"crc\" DevicePath \"\"" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.060463 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.065477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" (UID: "1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.165252 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.294154 4909 generic.go:334] "Generic (PLEG): container finished" podID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerID="1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174" exitCode=0 Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.294243 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerDied","Data":"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174"} Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.294358 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vm5fw" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.294395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm5fw" event={"ID":"1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe","Type":"ContainerDied","Data":"b5fee522fe3e6b13f32f31820cc8a095deeaf764637e5225f2e34f9b9c99b13e"} Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.294439 4909 scope.go:117] "RemoveContainer" containerID="1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.321615 4909 scope.go:117] "RemoveContainer" containerID="880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.354013 4909 scope.go:117] "RemoveContainer" containerID="43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.364281 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.383611 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vm5fw"] Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.442197 4909 scope.go:117] "RemoveContainer" containerID="1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174" Oct 02 18:57:23 crc kubenswrapper[4909]: E1002 18:57:23.442788 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174\": container with ID starting with 1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174 not found: ID does not exist" containerID="1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.442842 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174"} err="failed to get container status \"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174\": rpc error: code = NotFound desc = could not find container \"1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174\": container with ID starting with 1bc5f78e1594bbcfc412e882e049f3271ef141870e0ec66c19b1149e19cc2174 not found: ID does not exist" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.442877 4909 scope.go:117] "RemoveContainer" containerID="880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5" Oct 02 18:57:23 crc kubenswrapper[4909]: E1002 18:57:23.443395 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5\": container with ID starting with 880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5 not found: ID does not exist" containerID="880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.443436 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5"} err="failed to get container status \"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5\": rpc error: code = NotFound desc = could not find container \"880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5\": container with ID starting with 880f5de56fd97e81a42c7d1dbc93c8913b4968e06e5d69c87505313d2154c7f5 not found: ID does not exist" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.443463 4909 scope.go:117] "RemoveContainer" containerID="43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b" Oct 02 18:57:23 crc kubenswrapper[4909]: E1002 
18:57:23.443859 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b\": container with ID starting with 43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b not found: ID does not exist" containerID="43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.443904 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b"} err="failed to get container status \"43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b\": rpc error: code = NotFound desc = could not find container \"43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b\": container with ID starting with 43119106f94fcfea4953d8f8db26577f3e835803c2680b021b48217dc6682f0b not found: ID does not exist" Oct 02 18:57:23 crc kubenswrapper[4909]: I1002 18:57:23.624133 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" path="/var/lib/kubelet/pods/1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe/volumes" Oct 02 18:57:26 crc kubenswrapper[4909]: I1002 18:57:26.609766 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:57:26 crc kubenswrapper[4909]: E1002 18:57:26.610334 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:57:40 crc kubenswrapper[4909]: I1002 18:57:40.609300 
4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:57:40 crc kubenswrapper[4909]: E1002 18:57:40.610090 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:57:52 crc kubenswrapper[4909]: I1002 18:57:52.608385 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:57:52 crc kubenswrapper[4909]: E1002 18:57:52.609223 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:58:06 crc kubenswrapper[4909]: I1002 18:58:06.609116 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:58:06 crc kubenswrapper[4909]: E1002 18:58:06.609808 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:58:17 crc kubenswrapper[4909]: I1002 
18:58:17.608464 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:58:17 crc kubenswrapper[4909]: E1002 18:58:17.609373 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:58:30 crc kubenswrapper[4909]: I1002 18:58:30.609975 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:58:30 crc kubenswrapper[4909]: E1002 18:58:30.611372 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:58:43 crc kubenswrapper[4909]: I1002 18:58:43.609599 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:58:43 crc kubenswrapper[4909]: E1002 18:58:43.610816 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:58:56 crc 
kubenswrapper[4909]: I1002 18:58:56.611564 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:58:56 crc kubenswrapper[4909]: E1002 18:58:56.612465 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:59:11 crc kubenswrapper[4909]: I1002 18:59:11.608756 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:59:11 crc kubenswrapper[4909]: E1002 18:59:11.609615 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:59:26 crc kubenswrapper[4909]: I1002 18:59:26.608732 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:59:26 crc kubenswrapper[4909]: E1002 18:59:26.609706 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 
02 18:59:41 crc kubenswrapper[4909]: I1002 18:59:41.608345 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:59:41 crc kubenswrapper[4909]: E1002 18:59:41.609269 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 18:59:56 crc kubenswrapper[4909]: I1002 18:59:56.608554 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 18:59:56 crc kubenswrapper[4909]: E1002 18:59:56.609616 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.160093 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr"] Oct 02 19:00:00 crc kubenswrapper[4909]: E1002 19:00:00.161376 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="extract-content" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.161396 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="extract-content" Oct 02 19:00:00 crc kubenswrapper[4909]: E1002 19:00:00.161430 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="registry-server" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.161438 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="registry-server" Oct 02 19:00:00 crc kubenswrapper[4909]: E1002 19:00:00.161484 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="extract-utilities" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.161493 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="extract-utilities" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.161777 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1029ad0f-a916-4d9a-ab38-2ebe9f75bfbe" containerName="registry-server" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.162668 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.166016 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.167824 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.170895 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr"] Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.248284 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8r8\" (UniqueName: \"kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.248588 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.248668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.351835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b8r8\" (UniqueName: \"kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.351960 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.352082 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.353281 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.364451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.373800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b8r8\" (UniqueName: \"kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8\") pod \"collect-profiles-29323860-qrpjr\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.484781 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:00 crc kubenswrapper[4909]: I1002 19:00:00.980130 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr"] Oct 02 19:00:01 crc kubenswrapper[4909]: I1002 19:00:01.200715 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" event={"ID":"8f4cecbb-b3da-484e-be90-00ecbf10449f","Type":"ContainerStarted","Data":"5fa67be2951762c020d2d5c1e26b470539c265f9bae07512495c6b1161652d4c"} Oct 02 19:00:02 crc kubenswrapper[4909]: I1002 19:00:02.214961 4909 generic.go:334] "Generic (PLEG): container finished" podID="8f4cecbb-b3da-484e-be90-00ecbf10449f" containerID="c63922d5348508f261363f90e12cef7efc5f9dad1277b07f4660007cb35a2be3" exitCode=0 Oct 02 19:00:02 crc kubenswrapper[4909]: I1002 19:00:02.215117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" 
event={"ID":"8f4cecbb-b3da-484e-be90-00ecbf10449f","Type":"ContainerDied","Data":"c63922d5348508f261363f90e12cef7efc5f9dad1277b07f4660007cb35a2be3"} Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.734579 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.831126 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume\") pod \"8f4cecbb-b3da-484e-be90-00ecbf10449f\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.831177 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b8r8\" (UniqueName: \"kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8\") pod \"8f4cecbb-b3da-484e-be90-00ecbf10449f\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.831438 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume\") pod \"8f4cecbb-b3da-484e-be90-00ecbf10449f\" (UID: \"8f4cecbb-b3da-484e-be90-00ecbf10449f\") " Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.832215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f4cecbb-b3da-484e-be90-00ecbf10449f" (UID: "8f4cecbb-b3da-484e-be90-00ecbf10449f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.837855 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8" (OuterVolumeSpecName: "kube-api-access-9b8r8") pod "8f4cecbb-b3da-484e-be90-00ecbf10449f" (UID: "8f4cecbb-b3da-484e-be90-00ecbf10449f"). InnerVolumeSpecName "kube-api-access-9b8r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.839850 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f4cecbb-b3da-484e-be90-00ecbf10449f" (UID: "8f4cecbb-b3da-484e-be90-00ecbf10449f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.933297 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f4cecbb-b3da-484e-be90-00ecbf10449f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.933328 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f4cecbb-b3da-484e-be90-00ecbf10449f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:03 crc kubenswrapper[4909]: I1002 19:00:03.933339 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b8r8\" (UniqueName: \"kubernetes.io/projected/8f4cecbb-b3da-484e-be90-00ecbf10449f-kube-api-access-9b8r8\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:04 crc kubenswrapper[4909]: I1002 19:00:04.241608 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" 
event={"ID":"8f4cecbb-b3da-484e-be90-00ecbf10449f","Type":"ContainerDied","Data":"5fa67be2951762c020d2d5c1e26b470539c265f9bae07512495c6b1161652d4c"} Oct 02 19:00:04 crc kubenswrapper[4909]: I1002 19:00:04.241901 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa67be2951762c020d2d5c1e26b470539c265f9bae07512495c6b1161652d4c" Oct 02 19:00:04 crc kubenswrapper[4909]: I1002 19:00:04.241682 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr" Oct 02 19:00:04 crc kubenswrapper[4909]: I1002 19:00:04.830080 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"] Oct 02 19:00:04 crc kubenswrapper[4909]: I1002 19:00:04.838610 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323815-2zsdp"] Oct 02 19:00:05 crc kubenswrapper[4909]: I1002 19:00:05.659819 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb18400-1a15-48bb-a28e-27d19e0dd04c" path="/var/lib/kubelet/pods/ddb18400-1a15-48bb-a28e-27d19e0dd04c/volumes" Oct 02 19:00:09 crc kubenswrapper[4909]: I1002 19:00:09.619059 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:00:09 crc kubenswrapper[4909]: E1002 19:00:09.620096 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:00:20 crc kubenswrapper[4909]: I1002 19:00:20.608392 4909 scope.go:117] "RemoveContainer" 
containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:00:20 crc kubenswrapper[4909]: E1002 19:00:20.609064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:00:23 crc kubenswrapper[4909]: I1002 19:00:23.117683 4909 scope.go:117] "RemoveContainer" containerID="3f88b44b3eb5951b72d82d1da8b236f7b30e9bb8c2b8e41fb795bb20f6c60500" Oct 02 19:00:32 crc kubenswrapper[4909]: I1002 19:00:32.609110 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:00:32 crc kubenswrapper[4909]: E1002 19:00:32.610003 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:00:47 crc kubenswrapper[4909]: I1002 19:00:47.608690 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:00:47 crc kubenswrapper[4909]: E1002 19:00:47.609445 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:00:53 crc kubenswrapper[4909]: I1002 19:00:53.842720 4909 generic.go:334] "Generic (PLEG): container finished" podID="5124cbe9-04cf-4218-b2f2-96c5286e873b" containerID="e25c9e65a883b72369e4b3459bcf8998ae45aca51bf838600a5f8c40af608d80" exitCode=0 Oct 02 19:00:53 crc kubenswrapper[4909]: I1002 19:00:53.842854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" event={"ID":"5124cbe9-04cf-4218-b2f2-96c5286e873b","Type":"ContainerDied","Data":"e25c9e65a883b72369e4b3459bcf8998ae45aca51bf838600a5f8c40af608d80"} Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.450180 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.593597 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0\") pod \"5124cbe9-04cf-4218-b2f2-96c5286e873b\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.593713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key\") pod \"5124cbe9-04cf-4218-b2f2-96c5286e873b\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.593753 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tz5d\" (UniqueName: \"kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d\") pod \"5124cbe9-04cf-4218-b2f2-96c5286e873b\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " Oct 02 19:00:55 crc 
kubenswrapper[4909]: I1002 19:00:55.593788 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle\") pod \"5124cbe9-04cf-4218-b2f2-96c5286e873b\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.593899 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory\") pod \"5124cbe9-04cf-4218-b2f2-96c5286e873b\" (UID: \"5124cbe9-04cf-4218-b2f2-96c5286e873b\") " Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.600382 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d" (OuterVolumeSpecName: "kube-api-access-7tz5d") pod "5124cbe9-04cf-4218-b2f2-96c5286e873b" (UID: "5124cbe9-04cf-4218-b2f2-96c5286e873b"). InnerVolumeSpecName "kube-api-access-7tz5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.601203 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5124cbe9-04cf-4218-b2f2-96c5286e873b" (UID: "5124cbe9-04cf-4218-b2f2-96c5286e873b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.628999 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory" (OuterVolumeSpecName: "inventory") pod "5124cbe9-04cf-4218-b2f2-96c5286e873b" (UID: "5124cbe9-04cf-4218-b2f2-96c5286e873b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.645432 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5124cbe9-04cf-4218-b2f2-96c5286e873b" (UID: "5124cbe9-04cf-4218-b2f2-96c5286e873b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.655776 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5124cbe9-04cf-4218-b2f2-96c5286e873b" (UID: "5124cbe9-04cf-4218-b2f2-96c5286e873b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.697005 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.697087 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tz5d\" (UniqueName: \"kubernetes.io/projected/5124cbe9-04cf-4218-b2f2-96c5286e873b-kube-api-access-7tz5d\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.697102 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.697113 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 
19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.697125 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5124cbe9-04cf-4218-b2f2-96c5286e873b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.867703 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" event={"ID":"5124cbe9-04cf-4218-b2f2-96c5286e873b","Type":"ContainerDied","Data":"23b6bcd41e52ca395a19d619b7a1a1311b79d8a5a9819601fcec63c09d4a20ba"} Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.867751 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b6bcd41e52ca395a19d619b7a1a1311b79d8a5a9819601fcec63c09d4a20ba" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.867787 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.965463 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v"] Oct 02 19:00:55 crc kubenswrapper[4909]: E1002 19:00:55.966356 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4cecbb-b3da-484e-be90-00ecbf10449f" containerName="collect-profiles" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.966410 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4cecbb-b3da-484e-be90-00ecbf10449f" containerName="collect-profiles" Oct 02 19:00:55 crc kubenswrapper[4909]: E1002 19:00:55.966510 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5124cbe9-04cf-4218-b2f2-96c5286e873b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.966532 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5124cbe9-04cf-4218-b2f2-96c5286e873b" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.967015 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5124cbe9-04cf-4218-b2f2-96c5286e873b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.967140 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4cecbb-b3da-484e-be90-00ecbf10449f" containerName="collect-profiles" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.968780 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.971948 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.972140 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.972332 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.972556 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:00:55 crc kubenswrapper[4909]: I1002 19:00:55.974564 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.009004 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v"] Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105408 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105455 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" 
(UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105599 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.105669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfkz\" (UniqueName: \"kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207729 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207816 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfkz\" (UniqueName: \"kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 
19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207869 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207887 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207942 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.207975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.213063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.216457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.217586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.218420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 
19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.218547 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.219094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.223661 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfkz\" (UniqueName: \"kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-t582v\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.318707 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:00:56 crc kubenswrapper[4909]: I1002 19:00:56.891643 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v"] Oct 02 19:00:57 crc kubenswrapper[4909]: I1002 19:00:57.894780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" event={"ID":"2204de38-59ac-4528-b7d1-b7ab39dcc238","Type":"ContainerStarted","Data":"6aca5fbd6a801333f4fa6bb8c915bdad454825186968e3de4aee7a53ffe3cd82"} Oct 02 19:00:58 crc kubenswrapper[4909]: I1002 19:00:58.928177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" event={"ID":"2204de38-59ac-4528-b7d1-b7ab39dcc238","Type":"ContainerStarted","Data":"6465b68ce535665bfa3c4004b1441e4aa1ba1b624ae10d066c64660c792da54c"} Oct 02 19:00:58 crc kubenswrapper[4909]: I1002 19:00:58.970203 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" podStartSLOduration=3.209424694 podStartE2EDuration="3.970174067s" podCreationTimestamp="2025-10-02 19:00:55 +0000 UTC" firstStartedPulling="2025-10-02 19:00:56.900667346 +0000 UTC m=+2578.088163245" lastFinishedPulling="2025-10-02 19:00:57.661416719 +0000 UTC m=+2578.848912618" observedRunningTime="2025-10-02 19:00:58.948764996 +0000 UTC m=+2580.136260895" watchObservedRunningTime="2025-10-02 19:00:58.970174067 +0000 UTC m=+2580.157669936" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.135286 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323861-qs6x8"] Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.137456 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.165838 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323861-qs6x8"] Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.238220 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7fp\" (UniqueName: \"kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.238652 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.238744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.238804 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.340684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.340742 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.340772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.340894 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7fp\" (UniqueName: \"kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.348609 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.349432 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.350338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.365027 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7fp\" (UniqueName: \"kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp\") pod \"keystone-cron-29323861-qs6x8\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.517761 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:00 crc kubenswrapper[4909]: I1002 19:01:00.609077 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:01:00 crc kubenswrapper[4909]: E1002 19:01:00.609443 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:01:01 crc kubenswrapper[4909]: I1002 19:01:01.006965 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323861-qs6x8"] Oct 02 19:01:01 crc kubenswrapper[4909]: W1002 19:01:01.007875 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c8428d5_6fb6_4e18_84b2_d5f88dd993d7.slice/crio-4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392 WatchSource:0}: Error finding container 4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392: Status 404 returned error can't find the container with id 4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392 Oct 02 19:01:01 crc kubenswrapper[4909]: I1002 19:01:01.958278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323861-qs6x8" event={"ID":"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7","Type":"ContainerStarted","Data":"be269a0c77026620a9b260e9a6829b9d6c51a06993a2d36145cb63c47bcdb12f"} Oct 02 19:01:01 crc kubenswrapper[4909]: I1002 19:01:01.958854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323861-qs6x8" 
event={"ID":"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7","Type":"ContainerStarted","Data":"4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392"} Oct 02 19:01:01 crc kubenswrapper[4909]: I1002 19:01:01.985335 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323861-qs6x8" podStartSLOduration=1.9853162439999998 podStartE2EDuration="1.985316244s" podCreationTimestamp="2025-10-02 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:01:01.975423944 +0000 UTC m=+2583.162919843" watchObservedRunningTime="2025-10-02 19:01:01.985316244 +0000 UTC m=+2583.172812093" Oct 02 19:01:05 crc kubenswrapper[4909]: I1002 19:01:05.000326 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" containerID="be269a0c77026620a9b260e9a6829b9d6c51a06993a2d36145cb63c47bcdb12f" exitCode=0 Oct 02 19:01:05 crc kubenswrapper[4909]: I1002 19:01:05.000439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323861-qs6x8" event={"ID":"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7","Type":"ContainerDied","Data":"be269a0c77026620a9b260e9a6829b9d6c51a06993a2d36145cb63c47bcdb12f"} Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.538485 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.691721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys\") pod \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.692115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle\") pod \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.692141 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7fp\" (UniqueName: \"kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp\") pod \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.692193 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data\") pod \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\" (UID: \"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7\") " Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.705175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" (UID: "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.705288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp" (OuterVolumeSpecName: "kube-api-access-dd7fp") pod "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" (UID: "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7"). InnerVolumeSpecName "kube-api-access-dd7fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.729371 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" (UID: "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.760339 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data" (OuterVolumeSpecName: "config-data") pod "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" (UID: "8c8428d5-6fb6-4e18-84b2-d5f88dd993d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.794856 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.794915 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.794932 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:01:06 crc kubenswrapper[4909]: I1002 19:01:06.794948 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7fp\" (UniqueName: \"kubernetes.io/projected/8c8428d5-6fb6-4e18-84b2-d5f88dd993d7-kube-api-access-dd7fp\") on node \"crc\" DevicePath \"\"" Oct 02 19:01:07 crc kubenswrapper[4909]: I1002 19:01:07.028141 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323861-qs6x8" event={"ID":"8c8428d5-6fb6-4e18-84b2-d5f88dd993d7","Type":"ContainerDied","Data":"4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392"} Oct 02 19:01:07 crc kubenswrapper[4909]: I1002 19:01:07.028207 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4603194a72d482edb560b0d333d29a53ea317a7e0f739881ec3dc8070e2fd392" Oct 02 19:01:07 crc kubenswrapper[4909]: I1002 19:01:07.028740 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323861-qs6x8" Oct 02 19:01:14 crc kubenswrapper[4909]: I1002 19:01:14.608438 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:01:14 crc kubenswrapper[4909]: E1002 19:01:14.609448 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:01:29 crc kubenswrapper[4909]: I1002 19:01:29.619149 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:01:30 crc kubenswrapper[4909]: I1002 19:01:30.365812 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2"} Oct 02 19:01:47 crc kubenswrapper[4909]: I1002 19:01:47.433685 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5dc88f55df-4dlwb" podUID="811ebdce-cef0-4178-836b-17bcdd164575" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.506791 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:30 crc kubenswrapper[4909]: E1002 19:03:30.511073 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" containerName="keystone-cron" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 
19:03:30.511318 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" containerName="keystone-cron" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.512069 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8428d5-6fb6-4e18-84b2-d5f88dd993d7" containerName="keystone-cron" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.516346 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.538456 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.683478 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mws8j\" (UniqueName: \"kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.683803 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.683971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 
19:03:30.786168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.786370 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mws8j\" (UniqueName: \"kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.786515 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.786795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.787035 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.812322 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mws8j\" (UniqueName: \"kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j\") pod \"redhat-marketplace-2m2fb\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:30 crc kubenswrapper[4909]: I1002 19:03:30.851466 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:31 crc kubenswrapper[4909]: I1002 19:03:31.415966 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:31 crc kubenswrapper[4909]: I1002 19:03:31.815715 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerID="dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b" exitCode=0 Oct 02 19:03:31 crc kubenswrapper[4909]: I1002 19:03:31.815799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerDied","Data":"dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b"} Oct 02 19:03:31 crc kubenswrapper[4909]: I1002 19:03:31.816141 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerStarted","Data":"e073a9812bee11ef15d1753e5daf7b892dfd78ccc87e1b87440475e3a01de3bb"} Oct 02 19:03:31 crc kubenswrapper[4909]: I1002 19:03:31.819574 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:03:33 crc kubenswrapper[4909]: I1002 19:03:33.843062 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerID="205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667" exitCode=0 Oct 02 
19:03:33 crc kubenswrapper[4909]: I1002 19:03:33.843529 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerDied","Data":"205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667"} Oct 02 19:03:34 crc kubenswrapper[4909]: I1002 19:03:34.856182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerStarted","Data":"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502"} Oct 02 19:03:34 crc kubenswrapper[4909]: I1002 19:03:34.882759 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2m2fb" podStartSLOduration=2.356868947 podStartE2EDuration="4.882742042s" podCreationTimestamp="2025-10-02 19:03:30 +0000 UTC" firstStartedPulling="2025-10-02 19:03:31.819186461 +0000 UTC m=+2733.006682350" lastFinishedPulling="2025-10-02 19:03:34.345059596 +0000 UTC m=+2735.532555445" observedRunningTime="2025-10-02 19:03:34.876007212 +0000 UTC m=+2736.063503071" watchObservedRunningTime="2025-10-02 19:03:34.882742042 +0000 UTC m=+2736.070237901" Oct 02 19:03:40 crc kubenswrapper[4909]: I1002 19:03:40.852621 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:40 crc kubenswrapper[4909]: I1002 19:03:40.853216 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:40 crc kubenswrapper[4909]: I1002 19:03:40.926344 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:41 crc kubenswrapper[4909]: I1002 19:03:41.029417 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:41 crc kubenswrapper[4909]: I1002 19:03:41.176151 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:42 crc kubenswrapper[4909]: I1002 19:03:42.978781 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2m2fb" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="registry-server" containerID="cri-o://835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502" gracePeriod=2 Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.583154 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.681579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mws8j\" (UniqueName: \"kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j\") pod \"8c60ae24-0772-4342-aa75-cdd8802cf424\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.681675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities\") pod \"8c60ae24-0772-4342-aa75-cdd8802cf424\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.681768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content\") pod \"8c60ae24-0772-4342-aa75-cdd8802cf424\" (UID: \"8c60ae24-0772-4342-aa75-cdd8802cf424\") " Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.682842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities" (OuterVolumeSpecName: "utilities") pod "8c60ae24-0772-4342-aa75-cdd8802cf424" (UID: "8c60ae24-0772-4342-aa75-cdd8802cf424"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.690468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j" (OuterVolumeSpecName: "kube-api-access-mws8j") pod "8c60ae24-0772-4342-aa75-cdd8802cf424" (UID: "8c60ae24-0772-4342-aa75-cdd8802cf424"). InnerVolumeSpecName "kube-api-access-mws8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.700484 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c60ae24-0772-4342-aa75-cdd8802cf424" (UID: "8c60ae24-0772-4342-aa75-cdd8802cf424"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.785142 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.785584 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mws8j\" (UniqueName: \"kubernetes.io/projected/8c60ae24-0772-4342-aa75-cdd8802cf424-kube-api-access-mws8j\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.785605 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c60ae24-0772-4342-aa75-cdd8802cf424-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.990809 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerID="835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502" exitCode=0 Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.990861 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerDied","Data":"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502"} Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.990892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fb" event={"ID":"8c60ae24-0772-4342-aa75-cdd8802cf424","Type":"ContainerDied","Data":"e073a9812bee11ef15d1753e5daf7b892dfd78ccc87e1b87440475e3a01de3bb"} Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 19:03:43.990917 4909 scope.go:117] "RemoveContainer" containerID="835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502" Oct 02 19:03:43 crc kubenswrapper[4909]: I1002 
19:03:43.991085 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fb" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.024734 4909 scope.go:117] "RemoveContainer" containerID="205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.028043 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.038776 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fb"] Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.046900 4909 scope.go:117] "RemoveContainer" containerID="dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.102428 4909 scope.go:117] "RemoveContainer" containerID="835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502" Oct 02 19:03:44 crc kubenswrapper[4909]: E1002 19:03:44.103005 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502\": container with ID starting with 835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502 not found: ID does not exist" containerID="835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.103054 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502"} err="failed to get container status \"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502\": rpc error: code = NotFound desc = could not find container \"835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502\": container with ID starting with 
835f71a1f3e097af5acf86620b30d0562171b013e0310334fc61c56638a96502 not found: ID does not exist" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.103077 4909 scope.go:117] "RemoveContainer" containerID="205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667" Oct 02 19:03:44 crc kubenswrapper[4909]: E1002 19:03:44.103346 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667\": container with ID starting with 205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667 not found: ID does not exist" containerID="205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.103367 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667"} err="failed to get container status \"205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667\": rpc error: code = NotFound desc = could not find container \"205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667\": container with ID starting with 205f64540fa8779786483707b76bf378584f122b789d0e32c9747941f98b2667 not found: ID does not exist" Oct 02 19:03:44 crc kubenswrapper[4909]: I1002 19:03:44.103415 4909 scope.go:117] "RemoveContainer" containerID="dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b" Oct 02 19:03:44 crc kubenswrapper[4909]: E1002 19:03:44.103630 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b\": container with ID starting with dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b not found: ID does not exist" containerID="dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b" Oct 02 19:03:44 crc 
kubenswrapper[4909]: I1002 19:03:44.103648 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b"} err="failed to get container status \"dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b\": rpc error: code = NotFound desc = could not find container \"dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b\": container with ID starting with dd19175600713f61a1d29fb6f134da6c0bc2810b32d3df1267d665295aa0864b not found: ID does not exist" Oct 02 19:03:45 crc kubenswrapper[4909]: I1002 19:03:45.626336 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" path="/var/lib/kubelet/pods/8c60ae24-0772-4342-aa75-cdd8802cf424/volumes" Oct 02 19:03:50 crc kubenswrapper[4909]: I1002 19:03:50.056783 4909 generic.go:334] "Generic (PLEG): container finished" podID="2204de38-59ac-4528-b7d1-b7ab39dcc238" containerID="6465b68ce535665bfa3c4004b1441e4aa1ba1b624ae10d066c64660c792da54c" exitCode=0 Oct 02 19:03:50 crc kubenswrapper[4909]: I1002 19:03:50.056883 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" event={"ID":"2204de38-59ac-4528-b7d1-b7ab39dcc238","Type":"ContainerDied","Data":"6465b68ce535665bfa3c4004b1441e4aa1ba1b624ae10d066c64660c792da54c"} Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.572853 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679508 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679624 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679840 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.679910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrfkz\" (UniqueName: \"kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz\") pod \"2204de38-59ac-4528-b7d1-b7ab39dcc238\" (UID: \"2204de38-59ac-4528-b7d1-b7ab39dcc238\") " Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.686092 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.687739 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz" (OuterVolumeSpecName: "kube-api-access-qrfkz") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "kube-api-access-qrfkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.713089 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.713579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.713995 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.722893 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.724270 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory" (OuterVolumeSpecName: "inventory") pod "2204de38-59ac-4528-b7d1-b7ab39dcc238" (UID: "2204de38-59ac-4528-b7d1-b7ab39dcc238"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782855 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782883 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782894 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782906 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrfkz\" (UniqueName: \"kubernetes.io/projected/2204de38-59ac-4528-b7d1-b7ab39dcc238-kube-api-access-qrfkz\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782915 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782925 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:51 crc kubenswrapper[4909]: I1002 19:03:51.782934 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2204de38-59ac-4528-b7d1-b7ab39dcc238-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.095930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" event={"ID":"2204de38-59ac-4528-b7d1-b7ab39dcc238","Type":"ContainerDied","Data":"6aca5fbd6a801333f4fa6bb8c915bdad454825186968e3de4aee7a53ffe3cd82"} Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.095988 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aca5fbd6a801333f4fa6bb8c915bdad454825186968e3de4aee7a53ffe3cd82" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.096004 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.195965 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9"] Oct 02 19:03:52 crc kubenswrapper[4909]: E1002 19:03:52.196591 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="extract-content" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.196655 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="extract-content" Oct 02 19:03:52 crc kubenswrapper[4909]: E1002 19:03:52.196712 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="extract-utilities" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.196761 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="extract-utilities" Oct 02 19:03:52 crc kubenswrapper[4909]: E1002 19:03:52.196820 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="registry-server" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.196869 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="registry-server" Oct 02 19:03:52 crc kubenswrapper[4909]: E1002 19:03:52.196930 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204de38-59ac-4528-b7d1-b7ab39dcc238" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.196979 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204de38-59ac-4528-b7d1-b7ab39dcc238" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.201350 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204de38-59ac-4528-b7d1-b7ab39dcc238" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.201488 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c60ae24-0772-4342-aa75-cdd8802cf424" containerName="registry-server" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.202359 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.204508 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.207526 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.207955 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.208339 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.208787 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.217680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9"] Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.293435 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.293675 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5jx8\" (UniqueName: \"kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.293794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.293876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.293949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.294043 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.294160 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.396827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.397469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.397683 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.398485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.399360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.399718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.400133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5jx8\" (UniqueName: \"kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.403506 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.404834 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.405324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.405552 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 
crc kubenswrapper[4909]: I1002 19:03:52.407579 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.409296 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.418376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5jx8\" (UniqueName: \"kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:52 crc kubenswrapper[4909]: I1002 19:03:52.521322 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:03:53 crc kubenswrapper[4909]: I1002 19:03:53.054455 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:03:53 crc kubenswrapper[4909]: I1002 19:03:53.054802 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:03:53 crc kubenswrapper[4909]: I1002 19:03:53.088230 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9"] Oct 02 19:03:54 crc kubenswrapper[4909]: I1002 19:03:54.132307 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" event={"ID":"80e783af-bb38-4bf8-868c-ed135e64d57f","Type":"ContainerStarted","Data":"46292d3263fbb72b8552d98183673725927473541db4d877d28b097f1491d4ca"} Oct 02 19:03:54 crc kubenswrapper[4909]: I1002 19:03:54.132840 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" event={"ID":"80e783af-bb38-4bf8-868c-ed135e64d57f","Type":"ContainerStarted","Data":"3184209bc52c87c4d2f66f9a0b9303802aefeaade4fdf05dac7ef46a7a871e34"} Oct 02 19:03:54 crc kubenswrapper[4909]: I1002 19:03:54.167018 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" podStartSLOduration=1.645077236 
podStartE2EDuration="2.16699974s" podCreationTimestamp="2025-10-02 19:03:52 +0000 UTC" firstStartedPulling="2025-10-02 19:03:53.099698339 +0000 UTC m=+2754.287194208" lastFinishedPulling="2025-10-02 19:03:53.621620823 +0000 UTC m=+2754.809116712" observedRunningTime="2025-10-02 19:03:54.158398561 +0000 UTC m=+2755.345894420" watchObservedRunningTime="2025-10-02 19:03:54.16699974 +0000 UTC m=+2755.354495599" Oct 02 19:04:23 crc kubenswrapper[4909]: I1002 19:04:23.054662 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:04:23 crc kubenswrapper[4909]: I1002 19:04:23.055347 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.055106 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.055736 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.055816 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.056964 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.057118 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2" gracePeriod=600 Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.841944 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2" exitCode=0 Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.842056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2"} Oct 02 19:04:53 crc kubenswrapper[4909]: I1002 19:04:53.842426 4909 scope.go:117] "RemoveContainer" containerID="153a8fa6a2c713d25b40a7e522b985ac645abfeb19c36c7cb934f9bd6f489172" Oct 02 19:04:54 crc kubenswrapper[4909]: I1002 19:04:54.856425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f"} Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.225120 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.231379 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.240063 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.324731 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cn2w\" (UniqueName: \"kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.324799 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.324824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.427003 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.427099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.427379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cn2w\" (UniqueName: \"kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.427430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.427952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.455883 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9cn2w\" (UniqueName: \"kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w\") pod \"certified-operators-gxcmv\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:02 crc kubenswrapper[4909]: I1002 19:06:02.559727 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:03 crc kubenswrapper[4909]: W1002 19:06:03.084109 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb178c2b_87c8_4fce_bc4f_b2900e4322e3.slice/crio-c03b1419985306597dd38e299b59d36d8cddeddeeaeba94186467a0f0c287243 WatchSource:0}: Error finding container c03b1419985306597dd38e299b59d36d8cddeddeeaeba94186467a0f0c287243: Status 404 returned error can't find the container with id c03b1419985306597dd38e299b59d36d8cddeddeeaeba94186467a0f0c287243 Oct 02 19:06:03 crc kubenswrapper[4909]: I1002 19:06:03.088751 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:03 crc kubenswrapper[4909]: I1002 19:06:03.719272 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerID="bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed" exitCode=0 Oct 02 19:06:03 crc kubenswrapper[4909]: I1002 19:06:03.719385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerDied","Data":"bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed"} Oct 02 19:06:03 crc kubenswrapper[4909]: I1002 19:06:03.719682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" 
event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerStarted","Data":"c03b1419985306597dd38e299b59d36d8cddeddeeaeba94186467a0f0c287243"} Oct 02 19:06:05 crc kubenswrapper[4909]: I1002 19:06:05.744004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerStarted","Data":"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f"} Oct 02 19:06:06 crc kubenswrapper[4909]: I1002 19:06:06.757528 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerID="17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f" exitCode=0 Oct 02 19:06:06 crc kubenswrapper[4909]: I1002 19:06:06.757633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerDied","Data":"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f"} Oct 02 19:06:07 crc kubenswrapper[4909]: I1002 19:06:07.769584 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerStarted","Data":"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92"} Oct 02 19:06:07 crc kubenswrapper[4909]: I1002 19:06:07.802240 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gxcmv" podStartSLOduration=2.325364021 podStartE2EDuration="5.802213887s" podCreationTimestamp="2025-10-02 19:06:02 +0000 UTC" firstStartedPulling="2025-10-02 19:06:03.721833285 +0000 UTC m=+2884.909329154" lastFinishedPulling="2025-10-02 19:06:07.198683161 +0000 UTC m=+2888.386179020" observedRunningTime="2025-10-02 19:06:07.792610507 +0000 UTC m=+2888.980106376" watchObservedRunningTime="2025-10-02 19:06:07.802213887 +0000 UTC 
m=+2888.989709756" Oct 02 19:06:12 crc kubenswrapper[4909]: I1002 19:06:12.560941 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:12 crc kubenswrapper[4909]: I1002 19:06:12.561619 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:12 crc kubenswrapper[4909]: I1002 19:06:12.621770 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:12 crc kubenswrapper[4909]: I1002 19:06:12.907112 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:12 crc kubenswrapper[4909]: I1002 19:06:12.982022 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:14 crc kubenswrapper[4909]: I1002 19:06:14.848199 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gxcmv" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="registry-server" containerID="cri-o://946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92" gracePeriod=2 Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.365825 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.539443 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cn2w\" (UniqueName: \"kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w\") pod \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.539698 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities\") pod \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.539880 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content\") pod \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\" (UID: \"fb178c2b-87c8-4fce-bc4f-b2900e4322e3\") " Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.540524 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities" (OuterVolumeSpecName: "utilities") pod "fb178c2b-87c8-4fce-bc4f-b2900e4322e3" (UID: "fb178c2b-87c8-4fce-bc4f-b2900e4322e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.545162 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.552246 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w" (OuterVolumeSpecName: "kube-api-access-9cn2w") pod "fb178c2b-87c8-4fce-bc4f-b2900e4322e3" (UID: "fb178c2b-87c8-4fce-bc4f-b2900e4322e3"). InnerVolumeSpecName "kube-api-access-9cn2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.614189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb178c2b-87c8-4fce-bc4f-b2900e4322e3" (UID: "fb178c2b-87c8-4fce-bc4f-b2900e4322e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.646902 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.647082 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cn2w\" (UniqueName: \"kubernetes.io/projected/fb178c2b-87c8-4fce-bc4f-b2900e4322e3-kube-api-access-9cn2w\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.860654 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerID="946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92" exitCode=0 Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.860706 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerDied","Data":"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92"} Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.860772 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxcmv" event={"ID":"fb178c2b-87c8-4fce-bc4f-b2900e4322e3","Type":"ContainerDied","Data":"c03b1419985306597dd38e299b59d36d8cddeddeeaeba94186467a0f0c287243"} Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.860785 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxcmv" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.860804 4909 scope.go:117] "RemoveContainer" containerID="946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.889197 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.892604 4909 scope.go:117] "RemoveContainer" containerID="17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.899106 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gxcmv"] Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.921828 4909 scope.go:117] "RemoveContainer" containerID="bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.987949 4909 scope.go:117] "RemoveContainer" containerID="946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92" Oct 02 19:06:15 crc kubenswrapper[4909]: E1002 19:06:15.988661 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92\": container with ID starting with 946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92 not found: ID does not exist" containerID="946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.988746 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92"} err="failed to get container status \"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92\": rpc error: code = NotFound desc = could not find 
container \"946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92\": container with ID starting with 946d4ae8c219c201751776700aff33af99acfa7ae139384c0225a3e7ddf9de92 not found: ID does not exist" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.988816 4909 scope.go:117] "RemoveContainer" containerID="17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f" Oct 02 19:06:15 crc kubenswrapper[4909]: E1002 19:06:15.989332 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f\": container with ID starting with 17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f not found: ID does not exist" containerID="17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.989372 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f"} err="failed to get container status \"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f\": rpc error: code = NotFound desc = could not find container \"17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f\": container with ID starting with 17b4657c444d0630ab1e419df30dd844aa84390dfd4ca1fac6b017ab9729db0f not found: ID does not exist" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.989402 4909 scope.go:117] "RemoveContainer" containerID="bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed" Oct 02 19:06:15 crc kubenswrapper[4909]: E1002 19:06:15.989674 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed\": container with ID starting with bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed not found: ID does 
not exist" containerID="bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed" Oct 02 19:06:15 crc kubenswrapper[4909]: I1002 19:06:15.989764 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed"} err="failed to get container status \"bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed\": rpc error: code = NotFound desc = could not find container \"bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed\": container with ID starting with bc4c73ecf879eb2db97331ed61008746e5cee7902006fc4d9be4a89d40da6aed not found: ID does not exist" Oct 02 19:06:17 crc kubenswrapper[4909]: I1002 19:06:17.619723 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" path="/var/lib/kubelet/pods/fb178c2b-87c8-4fce-bc4f-b2900e4322e3/volumes" Oct 02 19:06:20 crc kubenswrapper[4909]: I1002 19:06:20.928785 4909 generic.go:334] "Generic (PLEG): container finished" podID="80e783af-bb38-4bf8-868c-ed135e64d57f" containerID="46292d3263fbb72b8552d98183673725927473541db4d877d28b097f1491d4ca" exitCode=0 Oct 02 19:06:20 crc kubenswrapper[4909]: I1002 19:06:20.929207 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" event={"ID":"80e783af-bb38-4bf8-868c-ed135e64d57f","Type":"ContainerDied","Data":"46292d3263fbb72b8552d98183673725927473541db4d877d28b097f1491d4ca"} Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.462219 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.603924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604002 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604122 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604247 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5jx8\" (UniqueName: \"kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604277 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.604337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle\") pod \"80e783af-bb38-4bf8-868c-ed135e64d57f\" (UID: \"80e783af-bb38-4bf8-868c-ed135e64d57f\") " Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.612362 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8" (OuterVolumeSpecName: "kube-api-access-b5jx8") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "kube-api-access-b5jx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.613353 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.641760 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.645380 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.666720 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory" (OuterVolumeSpecName: "inventory") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.667341 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.671481 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "80e783af-bb38-4bf8-868c-ed135e64d57f" (UID: "80e783af-bb38-4bf8-868c-ed135e64d57f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715000 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715048 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715061 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715072 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715081 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5jx8\" (UniqueName: \"kubernetes.io/projected/80e783af-bb38-4bf8-868c-ed135e64d57f-kube-api-access-b5jx8\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715090 4909 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.715099 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e783af-bb38-4bf8-868c-ed135e64d57f-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.953426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" event={"ID":"80e783af-bb38-4bf8-868c-ed135e64d57f","Type":"ContainerDied","Data":"3184209bc52c87c4d2f66f9a0b9303802aefeaade4fdf05dac7ef46a7a871e34"} Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.953755 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3184209bc52c87c4d2f66f9a0b9303802aefeaade4fdf05dac7ef46a7a871e34" Oct 02 19:06:22 crc kubenswrapper[4909]: I1002 19:06:22.953705 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.104411 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk"] Oct 02 19:06:23 crc kubenswrapper[4909]: E1002 19:06:23.105247 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="extract-content" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.105343 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="extract-content" Oct 02 19:06:23 crc kubenswrapper[4909]: E1002 19:06:23.105435 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e783af-bb38-4bf8-868c-ed135e64d57f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.105529 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e783af-bb38-4bf8-868c-ed135e64d57f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:06:23 crc kubenswrapper[4909]: E1002 19:06:23.105598 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="extract-utilities" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.105661 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="extract-utilities" Oct 02 19:06:23 crc kubenswrapper[4909]: E1002 19:06:23.105749 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="registry-server" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.105824 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="registry-server" Oct 02 19:06:23 crc 
kubenswrapper[4909]: I1002 19:06:23.106268 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e783af-bb38-4bf8-868c-ed135e64d57f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.106409 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb178c2b-87c8-4fce-bc4f-b2900e4322e3" containerName="registry-server" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.107416 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.109667 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.110102 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.110160 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.110335 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.110523 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.113923 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk"] Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.224215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.224380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.224517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkhdc\" (UniqueName: \"kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.224547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.224618 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: 
\"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.327075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.327252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkhdc\" (UniqueName: \"kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.327288 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.327343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.327380 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.331417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.331518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.331595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.333560 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.349548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkhdc\" (UniqueName: \"kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zxtgk\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.430821 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:23 crc kubenswrapper[4909]: I1002 19:06:23.985627 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk"] Oct 02 19:06:24 crc kubenswrapper[4909]: I1002 19:06:24.979956 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" event={"ID":"ab9ee6fa-d742-451b-9c9d-f534cc2bae29","Type":"ContainerStarted","Data":"881181012c7824ed02063dec31ff69f0aec758bf35bdbc7627e34baeb1daedd1"} Oct 02 19:06:24 crc kubenswrapper[4909]: I1002 19:06:24.980494 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" event={"ID":"ab9ee6fa-d742-451b-9c9d-f534cc2bae29","Type":"ContainerStarted","Data":"6c1a028be764d1983616852f4d5d426d9cb4676674c25f4c541cd368934b067f"} Oct 02 19:06:25 crc kubenswrapper[4909]: I1002 19:06:25.012680 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" podStartSLOduration=1.427505634 podStartE2EDuration="2.012652956s" podCreationTimestamp="2025-10-02 19:06:23 +0000 UTC" firstStartedPulling="2025-10-02 19:06:23.998395585 +0000 UTC m=+2905.185891444" lastFinishedPulling="2025-10-02 
19:06:24.583542897 +0000 UTC m=+2905.771038766" observedRunningTime="2025-10-02 19:06:24.999091852 +0000 UTC m=+2906.186587751" watchObservedRunningTime="2025-10-02 19:06:25.012652956 +0000 UTC m=+2906.200148855" Oct 02 19:06:47 crc kubenswrapper[4909]: I1002 19:06:47.285434 4909 generic.go:334] "Generic (PLEG): container finished" podID="ab9ee6fa-d742-451b-9c9d-f534cc2bae29" containerID="881181012c7824ed02063dec31ff69f0aec758bf35bdbc7627e34baeb1daedd1" exitCode=0 Oct 02 19:06:47 crc kubenswrapper[4909]: I1002 19:06:47.285506 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" event={"ID":"ab9ee6fa-d742-451b-9c9d-f534cc2bae29","Type":"ContainerDied","Data":"881181012c7824ed02063dec31ff69f0aec758bf35bdbc7627e34baeb1daedd1"} Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.798172 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.908226 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1\") pod \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.908585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0\") pod \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.908682 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkhdc\" (UniqueName: 
\"kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc\") pod \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.908913 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key\") pod \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.908946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory\") pod \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\" (UID: \"ab9ee6fa-d742-451b-9c9d-f534cc2bae29\") " Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.915210 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc" (OuterVolumeSpecName: "kube-api-access-zkhdc") pod "ab9ee6fa-d742-451b-9c9d-f534cc2bae29" (UID: "ab9ee6fa-d742-451b-9c9d-f534cc2bae29"). InnerVolumeSpecName "kube-api-access-zkhdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.942665 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ab9ee6fa-d742-451b-9c9d-f534cc2bae29" (UID: "ab9ee6fa-d742-451b-9c9d-f534cc2bae29"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.952102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "ab9ee6fa-d742-451b-9c9d-f534cc2bae29" (UID: "ab9ee6fa-d742-451b-9c9d-f534cc2bae29"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.955950 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory" (OuterVolumeSpecName: "inventory") pod "ab9ee6fa-d742-451b-9c9d-f534cc2bae29" (UID: "ab9ee6fa-d742-451b-9c9d-f534cc2bae29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:48 crc kubenswrapper[4909]: I1002 19:06:48.963704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "ab9ee6fa-d742-451b-9c9d-f534cc2bae29" (UID: "ab9ee6fa-d742-451b-9c9d-f534cc2bae29"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.015104 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkhdc\" (UniqueName: \"kubernetes.io/projected/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-kube-api-access-zkhdc\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.015134 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.015144 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.015153 4909 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.015163 4909 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab9ee6fa-d742-451b-9c9d-f534cc2bae29-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.308107 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" event={"ID":"ab9ee6fa-d742-451b-9c9d-f534cc2bae29","Type":"ContainerDied","Data":"6c1a028be764d1983616852f4d5d426d9cb4676674c25f4c541cd368934b067f"} Oct 02 19:06:49 crc kubenswrapper[4909]: I1002 19:06:49.308488 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c1a028be764d1983616852f4d5d426d9cb4676674c25f4c541cd368934b067f" Oct 02 19:06:49 
crc kubenswrapper[4909]: I1002 19:06:49.308201 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk" Oct 02 19:07:23 crc kubenswrapper[4909]: I1002 19:07:23.054785 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:07:23 crc kubenswrapper[4909]: I1002 19:07:23.055480 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:07:53 crc kubenswrapper[4909]: I1002 19:07:53.054506 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:07:53 crc kubenswrapper[4909]: I1002 19:07:53.055185 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.054482 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.054918 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.054961 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.055521 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.055574 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" gracePeriod=600 Oct 02 19:08:23 crc kubenswrapper[4909]: E1002 19:08:23.195458 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:08:23 crc 
kubenswrapper[4909]: I1002 19:08:23.459451 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" exitCode=0 Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.459538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f"} Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.459791 4909 scope.go:117] "RemoveContainer" containerID="fb76024eadcd8f4db5e461250577c256158613c29881d5eb2c546cfac0689fa2" Oct 02 19:08:23 crc kubenswrapper[4909]: I1002 19:08:23.460241 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:08:23 crc kubenswrapper[4909]: E1002 19:08:23.460525 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.609168 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:08:37 crc kubenswrapper[4909]: E1002 19:08:37.610583 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.898584 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:08:37 crc kubenswrapper[4909]: E1002 19:08:37.899757 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9ee6fa-d742-451b-9c9d-f534cc2bae29" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.899901 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9ee6fa-d742-451b-9c9d-f534cc2bae29" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.900469 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9ee6fa-d742-451b-9c9d-f534cc2bae29" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.903177 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:37 crc kubenswrapper[4909]: I1002 19:08:37.915291 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.092387 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.092450 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x89m\" (UniqueName: \"kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.092677 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.194146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.194300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.194332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x89m\" (UniqueName: \"kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.194785 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.194785 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.219883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x89m\" (UniqueName: \"kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m\") pod \"redhat-operators-xh4fq\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.233745 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:38 crc kubenswrapper[4909]: I1002 19:08:38.758693 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:08:39 crc kubenswrapper[4909]: I1002 19:08:39.715141 4909 generic.go:334] "Generic (PLEG): container finished" podID="a0f15315-79d7-4101-97db-20d9e782a42a" containerID="d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d" exitCode=0 Oct 02 19:08:39 crc kubenswrapper[4909]: I1002 19:08:39.715456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerDied","Data":"d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d"} Oct 02 19:08:39 crc kubenswrapper[4909]: I1002 19:08:39.715487 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerStarted","Data":"96cef7c302f9a627dfb033c9663a66763d3492caca245bcde2d5a13ccebdd62e"} Oct 02 19:08:39 crc kubenswrapper[4909]: I1002 19:08:39.717771 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:08:41 crc kubenswrapper[4909]: I1002 19:08:41.737602 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerStarted","Data":"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2"} Oct 02 19:08:42 crc kubenswrapper[4909]: E1002 19:08:42.615918 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f15315_79d7_4101_97db_20d9e782a42a.slice/crio-conmon-7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2.scope\": 
RecentStats: unable to find data in memory cache]" Oct 02 19:08:42 crc kubenswrapper[4909]: I1002 19:08:42.750749 4909 generic.go:334] "Generic (PLEG): container finished" podID="a0f15315-79d7-4101-97db-20d9e782a42a" containerID="7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2" exitCode=0 Oct 02 19:08:42 crc kubenswrapper[4909]: I1002 19:08:42.750818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerDied","Data":"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2"} Oct 02 19:08:46 crc kubenswrapper[4909]: I1002 19:08:46.792633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerStarted","Data":"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395"} Oct 02 19:08:46 crc kubenswrapper[4909]: I1002 19:08:46.821155 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xh4fq" podStartSLOduration=3.856355297 podStartE2EDuration="9.821136581s" podCreationTimestamp="2025-10-02 19:08:37 +0000 UTC" firstStartedPulling="2025-10-02 19:08:39.717207909 +0000 UTC m=+3040.904703808" lastFinishedPulling="2025-10-02 19:08:45.681989203 +0000 UTC m=+3046.869485092" observedRunningTime="2025-10-02 19:08:46.813690508 +0000 UTC m=+3048.001186377" watchObservedRunningTime="2025-10-02 19:08:46.821136581 +0000 UTC m=+3048.008632430" Oct 02 19:08:48 crc kubenswrapper[4909]: I1002 19:08:48.234215 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:48 crc kubenswrapper[4909]: I1002 19:08:48.234630 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:49 crc kubenswrapper[4909]: I1002 
19:08:49.300614 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xh4fq" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="registry-server" probeResult="failure" output=< Oct 02 19:08:49 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:08:49 crc kubenswrapper[4909]: > Oct 02 19:08:52 crc kubenswrapper[4909]: I1002 19:08:52.615680 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:08:52 crc kubenswrapper[4909]: E1002 19:08:52.616801 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:08:58 crc kubenswrapper[4909]: I1002 19:08:58.306671 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:58 crc kubenswrapper[4909]: I1002 19:08:58.363872 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:08:58 crc kubenswrapper[4909]: I1002 19:08:58.553055 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:08:59 crc kubenswrapper[4909]: I1002 19:08:59.951903 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xh4fq" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="registry-server" containerID="cri-o://187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395" gracePeriod=2 Oct 02 19:09:00 crc kubenswrapper[4909]: 
I1002 19:09:00.504323 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.681880 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities\") pod \"a0f15315-79d7-4101-97db-20d9e782a42a\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.682118 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x89m\" (UniqueName: \"kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m\") pod \"a0f15315-79d7-4101-97db-20d9e782a42a\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.682179 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content\") pod \"a0f15315-79d7-4101-97db-20d9e782a42a\" (UID: \"a0f15315-79d7-4101-97db-20d9e782a42a\") " Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.683642 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities" (OuterVolumeSpecName: "utilities") pod "a0f15315-79d7-4101-97db-20d9e782a42a" (UID: "a0f15315-79d7-4101-97db-20d9e782a42a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.690311 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m" (OuterVolumeSpecName: "kube-api-access-8x89m") pod "a0f15315-79d7-4101-97db-20d9e782a42a" (UID: "a0f15315-79d7-4101-97db-20d9e782a42a"). InnerVolumeSpecName "kube-api-access-8x89m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.761456 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f15315-79d7-4101-97db-20d9e782a42a" (UID: "a0f15315-79d7-4101-97db-20d9e782a42a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.784955 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x89m\" (UniqueName: \"kubernetes.io/projected/a0f15315-79d7-4101-97db-20d9e782a42a-kube-api-access-8x89m\") on node \"crc\" DevicePath \"\"" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.784999 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.785018 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f15315-79d7-4101-97db-20d9e782a42a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.964124 4909 generic.go:334] "Generic (PLEG): container finished" podID="a0f15315-79d7-4101-97db-20d9e782a42a" 
containerID="187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395" exitCode=0 Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.964182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerDied","Data":"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395"} Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.964189 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh4fq" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.964224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh4fq" event={"ID":"a0f15315-79d7-4101-97db-20d9e782a42a","Type":"ContainerDied","Data":"96cef7c302f9a627dfb033c9663a66763d3492caca245bcde2d5a13ccebdd62e"} Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.964246 4909 scope.go:117] "RemoveContainer" containerID="187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395" Oct 02 19:09:00 crc kubenswrapper[4909]: I1002 19:09:00.991309 4909 scope.go:117] "RemoveContainer" containerID="7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.003128 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.014067 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xh4fq"] Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.022957 4909 scope.go:117] "RemoveContainer" containerID="d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.078297 4909 scope.go:117] "RemoveContainer" containerID="187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395" Oct 02 19:09:01 crc 
kubenswrapper[4909]: E1002 19:09:01.078690 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395\": container with ID starting with 187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395 not found: ID does not exist" containerID="187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.078751 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395"} err="failed to get container status \"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395\": rpc error: code = NotFound desc = could not find container \"187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395\": container with ID starting with 187e83b5d3d5b5c9adb849d1277dc2e24c6aeea127e38e383b3c3b09f26c6395 not found: ID does not exist" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.078784 4909 scope.go:117] "RemoveContainer" containerID="7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2" Oct 02 19:09:01 crc kubenswrapper[4909]: E1002 19:09:01.079132 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2\": container with ID starting with 7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2 not found: ID does not exist" containerID="7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.079191 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2"} err="failed to get container status 
\"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2\": rpc error: code = NotFound desc = could not find container \"7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2\": container with ID starting with 7218be5588379d9a71b8320d0e456b4f4b7b479471efb13177e9deaa215efaa2 not found: ID does not exist" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.079222 4909 scope.go:117] "RemoveContainer" containerID="d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d" Oct 02 19:09:01 crc kubenswrapper[4909]: E1002 19:09:01.079524 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d\": container with ID starting with d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d not found: ID does not exist" containerID="d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.079562 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d"} err="failed to get container status \"d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d\": rpc error: code = NotFound desc = could not find container \"d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d\": container with ID starting with d217365b03d479a8489d9d8ef99e162085d8504a215039bcb8d4aff4b6bc466d not found: ID does not exist" Oct 02 19:09:01 crc kubenswrapper[4909]: I1002 19:09:01.625333 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" path="/var/lib/kubelet/pods/a0f15315-79d7-4101-97db-20d9e782a42a/volumes" Oct 02 19:09:04 crc kubenswrapper[4909]: I1002 19:09:04.608914 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 
19:09:04 crc kubenswrapper[4909]: E1002 19:09:04.609752 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:09:16 crc kubenswrapper[4909]: I1002 19:09:16.609087 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:09:16 crc kubenswrapper[4909]: E1002 19:09:16.610164 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:09:29 crc kubenswrapper[4909]: I1002 19:09:29.617087 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:09:29 crc kubenswrapper[4909]: E1002 19:09:29.617958 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:09:40 crc kubenswrapper[4909]: I1002 19:09:40.609257 4909 scope.go:117] "RemoveContainer" 
containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:09:40 crc kubenswrapper[4909]: E1002 19:09:40.610487 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:09:53 crc kubenswrapper[4909]: I1002 19:09:53.609097 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:09:53 crc kubenswrapper[4909]: E1002 19:09:53.610328 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:10:07 crc kubenswrapper[4909]: I1002 19:10:07.608796 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:10:07 crc kubenswrapper[4909]: E1002 19:10:07.609916 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:10:19 crc kubenswrapper[4909]: I1002 19:10:19.621945 4909 scope.go:117] 
"RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:10:19 crc kubenswrapper[4909]: E1002 19:10:19.623743 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:10:34 crc kubenswrapper[4909]: I1002 19:10:34.608544 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:10:34 crc kubenswrapper[4909]: E1002 19:10:34.609468 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:10:49 crc kubenswrapper[4909]: I1002 19:10:49.621503 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:10:49 crc kubenswrapper[4909]: E1002 19:10:49.624224 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:04 crc kubenswrapper[4909]: I1002 19:11:04.609409 
4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:11:04 crc kubenswrapper[4909]: E1002 19:11:04.610458 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:15 crc kubenswrapper[4909]: I1002 19:11:15.610204 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:11:15 crc kubenswrapper[4909]: E1002 19:11:15.611357 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:28 crc kubenswrapper[4909]: I1002 19:11:28.609865 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:11:28 crc kubenswrapper[4909]: E1002 19:11:28.611247 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:41 crc kubenswrapper[4909]: I1002 
19:11:41.608317 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:11:41 crc kubenswrapper[4909]: E1002 19:11:41.609398 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.505740 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:11:47 crc kubenswrapper[4909]: E1002 19:11:47.511249 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="extract-content" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.511292 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="extract-content" Oct 02 19:11:47 crc kubenswrapper[4909]: E1002 19:11:47.511332 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="extract-utilities" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.511346 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="extract-utilities" Oct 02 19:11:47 crc kubenswrapper[4909]: E1002 19:11:47.511417 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="registry-server" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.511431 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="registry-server" Oct 02 
19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.511905 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f15315-79d7-4101-97db-20d9e782a42a" containerName="registry-server" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.515507 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.565349 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.647152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.647289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrqxc\" (UniqueName: \"kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.647418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.748988 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.749306 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrqxc\" (UniqueName: \"kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.749378 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.749570 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.749990 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.774266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrqxc\" (UniqueName: 
\"kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc\") pod \"community-operators-7tss9\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:47 crc kubenswrapper[4909]: I1002 19:11:47.837315 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:48 crc kubenswrapper[4909]: I1002 19:11:48.415520 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:11:48 crc kubenswrapper[4909]: W1002 19:11:48.415958 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9f0aa2_117e_43e2_aa0d_e296763577bd.slice/crio-e51cf1550819046bb53d74de920aa4087ba4a734418701fb976bdc927e5b85b6 WatchSource:0}: Error finding container e51cf1550819046bb53d74de920aa4087ba4a734418701fb976bdc927e5b85b6: Status 404 returned error can't find the container with id e51cf1550819046bb53d74de920aa4087ba4a734418701fb976bdc927e5b85b6 Oct 02 19:11:48 crc kubenswrapper[4909]: I1002 19:11:48.996864 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerID="2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b" exitCode=0 Oct 02 19:11:48 crc kubenswrapper[4909]: I1002 19:11:48.996919 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerDied","Data":"2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b"} Oct 02 19:11:48 crc kubenswrapper[4909]: I1002 19:11:48.996983 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" 
event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerStarted","Data":"e51cf1550819046bb53d74de920aa4087ba4a734418701fb976bdc927e5b85b6"} Oct 02 19:11:51 crc kubenswrapper[4909]: I1002 19:11:51.042457 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerStarted","Data":"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b"} Oct 02 19:11:52 crc kubenswrapper[4909]: I1002 19:11:52.058380 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerID="2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b" exitCode=0 Oct 02 19:11:52 crc kubenswrapper[4909]: I1002 19:11:52.058444 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerDied","Data":"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b"} Oct 02 19:11:53 crc kubenswrapper[4909]: I1002 19:11:53.102702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerStarted","Data":"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205"} Oct 02 19:11:53 crc kubenswrapper[4909]: I1002 19:11:53.139666 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7tss9" podStartSLOduration=2.559109539 podStartE2EDuration="6.139639932s" podCreationTimestamp="2025-10-02 19:11:47 +0000 UTC" firstStartedPulling="2025-10-02 19:11:48.999751596 +0000 UTC m=+3230.187247495" lastFinishedPulling="2025-10-02 19:11:52.580281989 +0000 UTC m=+3233.767777888" observedRunningTime="2025-10-02 19:11:53.129694783 +0000 UTC m=+3234.317190672" watchObservedRunningTime="2025-10-02 19:11:53.139639932 +0000 UTC 
m=+3234.327135791" Oct 02 19:11:56 crc kubenswrapper[4909]: I1002 19:11:56.608965 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:11:56 crc kubenswrapper[4909]: E1002 19:11:56.609584 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:11:57 crc kubenswrapper[4909]: I1002 19:11:57.837569 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:57 crc kubenswrapper[4909]: I1002 19:11:57.839370 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:57 crc kubenswrapper[4909]: I1002 19:11:57.914386 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:58 crc kubenswrapper[4909]: I1002 19:11:58.243593 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:11:58 crc kubenswrapper[4909]: I1002 19:11:58.305808 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.181614 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7tss9" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="registry-server" containerID="cri-o://0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205" gracePeriod=2 Oct 
02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.722043 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.757347 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrqxc\" (UniqueName: \"kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc\") pod \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.757461 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities\") pod \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.757574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content\") pod \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\" (UID: \"5c9f0aa2-117e-43e2-aa0d-e296763577bd\") " Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.765923 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities" (OuterVolumeSpecName: "utilities") pod "5c9f0aa2-117e-43e2-aa0d-e296763577bd" (UID: "5c9f0aa2-117e-43e2-aa0d-e296763577bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.771234 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc" (OuterVolumeSpecName: "kube-api-access-jrqxc") pod "5c9f0aa2-117e-43e2-aa0d-e296763577bd" (UID: "5c9f0aa2-117e-43e2-aa0d-e296763577bd"). InnerVolumeSpecName "kube-api-access-jrqxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.803303 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c9f0aa2-117e-43e2-aa0d-e296763577bd" (UID: "5c9f0aa2-117e-43e2-aa0d-e296763577bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.859960 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.859988 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrqxc\" (UniqueName: \"kubernetes.io/projected/5c9f0aa2-117e-43e2-aa0d-e296763577bd-kube-api-access-jrqxc\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:00 crc kubenswrapper[4909]: I1002 19:12:00.859999 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c9f0aa2-117e-43e2-aa0d-e296763577bd-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.208692 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" 
containerID="0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205" exitCode=0 Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.208820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerDied","Data":"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205"} Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.208884 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tss9" event={"ID":"5c9f0aa2-117e-43e2-aa0d-e296763577bd","Type":"ContainerDied","Data":"e51cf1550819046bb53d74de920aa4087ba4a734418701fb976bdc927e5b85b6"} Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.208927 4909 scope.go:117] "RemoveContainer" containerID="0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.209571 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7tss9" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.243448 4909 scope.go:117] "RemoveContainer" containerID="2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.291332 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.299496 4909 scope.go:117] "RemoveContainer" containerID="2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.303148 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7tss9"] Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.366467 4909 scope.go:117] "RemoveContainer" containerID="0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205" Oct 02 19:12:01 crc kubenswrapper[4909]: E1002 19:12:01.366971 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205\": container with ID starting with 0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205 not found: ID does not exist" containerID="0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.367048 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205"} err="failed to get container status \"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205\": rpc error: code = NotFound desc = could not find container \"0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205\": container with ID starting with 0c8d705143c052b6bd79c7f5f4d68d5cf51d8a9f8f18d28d6499ca30ec1ff205 not 
found: ID does not exist" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.367093 4909 scope.go:117] "RemoveContainer" containerID="2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b" Oct 02 19:12:01 crc kubenswrapper[4909]: E1002 19:12:01.367658 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b\": container with ID starting with 2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b not found: ID does not exist" containerID="2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.367698 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b"} err="failed to get container status \"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b\": rpc error: code = NotFound desc = could not find container \"2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b\": container with ID starting with 2952a3779cf242c483e923f4186dfaeb0c608ffe5cadbd600a1d9f24b6843b6b not found: ID does not exist" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.367946 4909 scope.go:117] "RemoveContainer" containerID="2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b" Oct 02 19:12:01 crc kubenswrapper[4909]: E1002 19:12:01.371672 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b\": container with ID starting with 2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b not found: ID does not exist" containerID="2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.371755 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b"} err="failed to get container status \"2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b\": rpc error: code = NotFound desc = could not find container \"2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b\": container with ID starting with 2fa023ab29cecb9049b8fd86597c5305f40a4937360d35778ddb19d7c5cb4c8b not found: ID does not exist" Oct 02 19:12:01 crc kubenswrapper[4909]: I1002 19:12:01.622117 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" path="/var/lib/kubelet/pods/5c9f0aa2-117e-43e2-aa0d-e296763577bd/volumes" Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.485515 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.513111 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.528573 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qb725"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.537554 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.546078 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.556807 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.564499 4909 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zxtgk"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.572449 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.579750 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.599739 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.599802 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n9x86"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.599811 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.613378 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsnx2"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.613431 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gg9gd"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.614871 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:12:10 crc kubenswrapper[4909]: E1002 19:12:10.615411 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.621568 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.629342 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.637950 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.646137 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xm7rx"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.653662 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-slmwb"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.661963 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n9x86"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.669881 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.677910 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zfkjd"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.685386 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.692266 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4qhs5"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.703657 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-t582v"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.710248 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.717111 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7jkvc"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.724506 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-f7fk9"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.732767 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cwlnr"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.739625 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.746478 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-94xbr"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.754761 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xmv8x"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.763180 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r6zfg"] Oct 02 19:12:10 crc kubenswrapper[4909]: I1002 19:12:10.770454 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndwjl"] Oct 02 19:12:11 crc 
kubenswrapper[4909]: I1002 19:12:11.624225 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8580e1-88af-49f4-aef1-f703b4a72b62" path="/var/lib/kubelet/pods/1a8580e1-88af-49f4-aef1-f703b4a72b62/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.625615 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2204de38-59ac-4528-b7d1-b7ab39dcc238" path="/var/lib/kubelet/pods/2204de38-59ac-4528-b7d1-b7ab39dcc238/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.626680 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267cf7a5-8352-42ca-be8a-588a26e74159" path="/var/lib/kubelet/pods/267cf7a5-8352-42ca-be8a-588a26e74159/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.628007 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4c5c95-6799-4fa2-a07e-589509ca4f32" path="/var/lib/kubelet/pods/2d4c5c95-6799-4fa2-a07e-589509ca4f32/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.629372 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c4281e-6d35-4eef-8ac9-a86f5cd35fc5" path="/var/lib/kubelet/pods/36c4281e-6d35-4eef-8ac9-a86f5cd35fc5/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.629993 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38813735-9677-4533-a46e-f07e0fe43cdc" path="/var/lib/kubelet/pods/38813735-9677-4533-a46e-f07e0fe43cdc/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.630769 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c49ad32-8f28-4e88-8560-929da446dcdf" path="/var/lib/kubelet/pods/4c49ad32-8f28-4e88-8560-929da446dcdf/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.631835 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8285c9-a626-4f86-813b-4e144be2b061" path="/var/lib/kubelet/pods/4f8285c9-a626-4f86-813b-4e144be2b061/volumes" Oct 02 19:12:11 crc 
kubenswrapper[4909]: I1002 19:12:11.632395 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5124cbe9-04cf-4218-b2f2-96c5286e873b" path="/var/lib/kubelet/pods/5124cbe9-04cf-4218-b2f2-96c5286e873b/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.632937 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542798b4-6ef4-45c9-b91e-d692cb757dae" path="/var/lib/kubelet/pods/542798b4-6ef4-45c9-b91e-d692cb757dae/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.634150 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e783af-bb38-4bf8-868c-ed135e64d57f" path="/var/lib/kubelet/pods/80e783af-bb38-4bf8-868c-ed135e64d57f/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.634747 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="866bbad8-4176-4a13-9cbe-674edd9c52bb" path="/var/lib/kubelet/pods/866bbad8-4176-4a13-9cbe-674edd9c52bb/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.635312 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9ee6fa-d742-451b-9c9d-f534cc2bae29" path="/var/lib/kubelet/pods/ab9ee6fa-d742-451b-9c9d-f534cc2bae29/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.635955 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdd9d4a-c13e-4dec-9808-07d9dd4b12ed" path="/var/lib/kubelet/pods/abdd9d4a-c13e-4dec-9808-07d9dd4b12ed/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.637964 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acdf279c-a962-420a-b005-b5a736105600" path="/var/lib/kubelet/pods/acdf279c-a962-420a-b005-b5a736105600/volumes" Oct 02 19:12:11 crc kubenswrapper[4909]: I1002 19:12:11.639367 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b880f8-829a-4ad8-b7a0-146e92ea1a4b" path="/var/lib/kubelet/pods/d3b880f8-829a-4ad8-b7a0-146e92ea1a4b/volumes" Oct 02 19:12:11 crc 
kubenswrapper[4909]: I1002 19:12:11.640264 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df963eb6-4e59-4dc7-ab3f-dbb8459276a2" path="/var/lib/kubelet/pods/df963eb6-4e59-4dc7-ab3f-dbb8459276a2/volumes" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.413121 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm"] Oct 02 19:12:15 crc kubenswrapper[4909]: E1002 19:12:15.413758 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="extract-utilities" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.413775 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="extract-utilities" Oct 02 19:12:15 crc kubenswrapper[4909]: E1002 19:12:15.413791 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="registry-server" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.413797 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="registry-server" Oct 02 19:12:15 crc kubenswrapper[4909]: E1002 19:12:15.413832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="extract-content" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.413842 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="extract-content" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.414088 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9f0aa2-117e-43e2-aa0d-e296763577bd" containerName="registry-server" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.414866 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.416909 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.416953 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.416988 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.417248 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.419215 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.426195 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm"] Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.518275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.518364 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: 
\"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.518397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfdd\" (UniqueName: \"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.518427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.518624 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.620842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfdd\" (UniqueName: \"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.620896 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.620970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.621083 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.621114 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.627051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: 
\"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.627315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.628488 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.634593 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.638015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfdd\" (UniqueName: \"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:15 crc kubenswrapper[4909]: I1002 19:12:15.777946 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:16 crc kubenswrapper[4909]: I1002 19:12:16.410749 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm"] Oct 02 19:12:16 crc kubenswrapper[4909]: I1002 19:12:16.440058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" event={"ID":"c1372353-9e7d-4e84-b8b9-44db5e82f9d1","Type":"ContainerStarted","Data":"5f51f74ee370fd9389748ced461c23146c54e703e6987ea53278bc2d32c3b710"} Oct 02 19:12:17 crc kubenswrapper[4909]: I1002 19:12:17.450226 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" event={"ID":"c1372353-9e7d-4e84-b8b9-44db5e82f9d1","Type":"ContainerStarted","Data":"df4fc2fa1d7e37c9ca09c1caa60ba536ad487e7d176400129b4940fdb8cce5bb"} Oct 02 19:12:17 crc kubenswrapper[4909]: I1002 19:12:17.467753 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" podStartSLOduration=2.062079381 podStartE2EDuration="2.467731097s" podCreationTimestamp="2025-10-02 19:12:15 +0000 UTC" firstStartedPulling="2025-10-02 19:12:16.435231579 +0000 UTC m=+3257.622727438" lastFinishedPulling="2025-10-02 19:12:16.840883255 +0000 UTC m=+3258.028379154" observedRunningTime="2025-10-02 19:12:17.464830187 +0000 UTC m=+3258.652326066" watchObservedRunningTime="2025-10-02 19:12:17.467731097 +0000 UTC m=+3258.655226956" Oct 02 19:12:22 crc kubenswrapper[4909]: I1002 19:12:22.609558 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:12:22 crc kubenswrapper[4909]: E1002 19:12:22.610959 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.520505 4909 scope.go:117] "RemoveContainer" containerID="63ef418178edc5628439be39c3c631404343258f5dac21d1e7510d2db85476b8" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.588842 4909 scope.go:117] "RemoveContainer" containerID="c6037637875077f0f9852518f5a97ac57dcac3220d24ca93d3c4314812ea1a6e" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.643250 4909 scope.go:117] "RemoveContainer" containerID="e25c9e65a883b72369e4b3459bcf8998ae45aca51bf838600a5f8c40af608d80" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.724247 4909 scope.go:117] "RemoveContainer" containerID="27cf801efd2f22348cf144e13997d30068e84def0a48004f2064b0e60ad2da5d" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.756570 4909 scope.go:117] "RemoveContainer" containerID="44f1316a2a028c431a9377517d40c47ac031ea9620ef580224bd4b91a4dc3564" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.841159 4909 scope.go:117] "RemoveContainer" containerID="6f62e4484fed2f37efe6e943960688f58f15033ed320adca8f1760bd030b469f" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.884742 4909 scope.go:117] "RemoveContainer" containerID="00af8b52e14541e0304fd45dbe7b1527aa154a1fcd5166d39f0c4c73bfa507de" Oct 02 19:12:23 crc kubenswrapper[4909]: I1002 19:12:23.939283 4909 scope.go:117] "RemoveContainer" containerID="46292d3263fbb72b8552d98183673725927473541db4d877d28b097f1491d4ca" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.016013 4909 scope.go:117] "RemoveContainer" containerID="3b376c0364942d95e88d49b66642f6d0b1664c36b59c4ccfe9c72b5e87a948dd" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.082735 4909 scope.go:117] "RemoveContainer" 
containerID="255972131ad06ab7721d6ced1cebe0a1c754b43b7b1c0070db376c5605dfd772" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.157734 4909 scope.go:117] "RemoveContainer" containerID="86c7b37c85444e7e05ddbb33e8352baf763989e976e2246f51300b7f0ef77ee3" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.196960 4909 scope.go:117] "RemoveContainer" containerID="d77689118487f1d0d5239c83f4293f549775fe36da20ba11f7cc9d070ca90592" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.250476 4909 scope.go:117] "RemoveContainer" containerID="6465b68ce535665bfa3c4004b1441e4aa1ba1b624ae10d066c64660c792da54c" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.329289 4909 scope.go:117] "RemoveContainer" containerID="3328904eeb1e5fc4a29b42438c52fb135aa466dbbe400cf8f31105cd3a66a05e" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.380913 4909 scope.go:117] "RemoveContainer" containerID="e848759f97d2b6a92addc5dee4821498af6061fb632a26e5e5393c1b9dcfa374" Oct 02 19:12:24 crc kubenswrapper[4909]: I1002 19:12:24.427245 4909 scope.go:117] "RemoveContainer" containerID="f1233fe3c0905275bb9f9d380544c30bba0c3c395f5d3f86c0de77ccfd7c2243" Oct 02 19:12:30 crc kubenswrapper[4909]: I1002 19:12:30.644174 4909 generic.go:334] "Generic (PLEG): container finished" podID="c1372353-9e7d-4e84-b8b9-44db5e82f9d1" containerID="df4fc2fa1d7e37c9ca09c1caa60ba536ad487e7d176400129b4940fdb8cce5bb" exitCode=0 Oct 02 19:12:30 crc kubenswrapper[4909]: I1002 19:12:30.644268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" event={"ID":"c1372353-9e7d-4e84-b8b9-44db5e82f9d1","Type":"ContainerDied","Data":"df4fc2fa1d7e37c9ca09c1caa60ba536ad487e7d176400129b4940fdb8cce5bb"} Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.158271 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.296674 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key\") pod \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.296770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfdd\" (UniqueName: \"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd\") pod \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.296797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph\") pod \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.296869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory\") pod \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.297050 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle\") pod \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\" (UID: \"c1372353-9e7d-4e84-b8b9-44db5e82f9d1\") " Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.302494 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd" (OuterVolumeSpecName: "kube-api-access-pkfdd") pod "c1372353-9e7d-4e84-b8b9-44db5e82f9d1" (UID: "c1372353-9e7d-4e84-b8b9-44db5e82f9d1"). InnerVolumeSpecName "kube-api-access-pkfdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.302961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c1372353-9e7d-4e84-b8b9-44db5e82f9d1" (UID: "c1372353-9e7d-4e84-b8b9-44db5e82f9d1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.305335 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph" (OuterVolumeSpecName: "ceph") pod "c1372353-9e7d-4e84-b8b9-44db5e82f9d1" (UID: "c1372353-9e7d-4e84-b8b9-44db5e82f9d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.349700 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1372353-9e7d-4e84-b8b9-44db5e82f9d1" (UID: "c1372353-9e7d-4e84-b8b9-44db5e82f9d1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.354426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory" (OuterVolumeSpecName: "inventory") pod "c1372353-9e7d-4e84-b8b9-44db5e82f9d1" (UID: "c1372353-9e7d-4e84-b8b9-44db5e82f9d1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.398721 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.398755 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.398768 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfdd\" (UniqueName: \"kubernetes.io/projected/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-kube-api-access-pkfdd\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.398779 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.398788 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1372353-9e7d-4e84-b8b9-44db5e82f9d1-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.675807 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" event={"ID":"c1372353-9e7d-4e84-b8b9-44db5e82f9d1","Type":"ContainerDied","Data":"5f51f74ee370fd9389748ced461c23146c54e703e6987ea53278bc2d32c3b710"} Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.675853 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f51f74ee370fd9389748ced461c23146c54e703e6987ea53278bc2d32c3b710" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.675890 
4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.747589 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2"] Oct 02 19:12:32 crc kubenswrapper[4909]: E1002 19:12:32.748241 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1372353-9e7d-4e84-b8b9-44db5e82f9d1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.748272 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1372353-9e7d-4e84-b8b9-44db5e82f9d1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.748677 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1372353-9e7d-4e84-b8b9-44db5e82f9d1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.750049 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.754297 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.754526 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.754579 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.755404 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.756316 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.777874 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2"] Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.909960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.910206 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") 
" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.910428 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.910521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:32 crc kubenswrapper[4909]: I1002 19:12:32.910670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9ghz\" (UniqueName: \"kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.012833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.012999 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.013118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.013208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9ghz\" (UniqueName: \"kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.013276 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.019067 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.019309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.019697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.019720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.045004 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9ghz\" (UniqueName: \"kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.072195 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:12:33 crc kubenswrapper[4909]: I1002 19:12:33.714728 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2"] Oct 02 19:12:33 crc kubenswrapper[4909]: W1002 19:12:33.726408 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50be6a2c_c436_47ba_a5c7_d2cb51151c09.slice/crio-fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff WatchSource:0}: Error finding container fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff: Status 404 returned error can't find the container with id fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff Oct 02 19:12:34 crc kubenswrapper[4909]: I1002 19:12:34.702017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" event={"ID":"50be6a2c-c436-47ba-a5c7-d2cb51151c09","Type":"ContainerStarted","Data":"fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff"} Oct 02 19:12:35 crc kubenswrapper[4909]: I1002 19:12:35.609645 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:12:35 crc kubenswrapper[4909]: E1002 19:12:35.610535 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:12:35 crc kubenswrapper[4909]: I1002 19:12:35.716753 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" event={"ID":"50be6a2c-c436-47ba-a5c7-d2cb51151c09","Type":"ContainerStarted","Data":"ed566e5c94fa933dd8f573bca0f80afb290e8ff3f00aadaa8838be3c6d87493c"} Oct 02 19:12:35 crc kubenswrapper[4909]: I1002 19:12:35.738423 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" podStartSLOduration=3.052981192 podStartE2EDuration="3.738397567s" podCreationTimestamp="2025-10-02 19:12:32 +0000 UTC" firstStartedPulling="2025-10-02 19:12:33.731520902 +0000 UTC m=+3274.919016781" lastFinishedPulling="2025-10-02 19:12:34.416937297 +0000 UTC m=+3275.604433156" observedRunningTime="2025-10-02 19:12:35.7365706 +0000 UTC m=+3276.924066499" watchObservedRunningTime="2025-10-02 19:12:35.738397567 +0000 UTC m=+3276.925893476" Oct 02 19:12:48 crc kubenswrapper[4909]: I1002 19:12:48.608916 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:12:48 crc kubenswrapper[4909]: E1002 19:12:48.610294 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:13:00 crc kubenswrapper[4909]: I1002 19:13:00.608636 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:13:00 crc kubenswrapper[4909]: E1002 19:13:00.609253 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:13:13 crc kubenswrapper[4909]: I1002 19:13:13.609546 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:13:13 crc kubenswrapper[4909]: E1002 19:13:13.610405 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:13:24 crc kubenswrapper[4909]: I1002 19:13:24.817655 4909 scope.go:117] "RemoveContainer" containerID="881181012c7824ed02063dec31ff69f0aec758bf35bdbc7627e34baeb1daedd1" Oct 02 19:13:26 crc kubenswrapper[4909]: I1002 19:13:26.608813 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:13:27 crc kubenswrapper[4909]: I1002 19:13:27.371456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d"} Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.738282 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.741637 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.746539 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.815374 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.815469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d5bk\" (UniqueName: \"kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.815516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.918082 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.918156 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2d5bk\" (UniqueName: \"kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.918203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.918662 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.918745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:42 crc kubenswrapper[4909]: I1002 19:13:42.937979 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d5bk\" (UniqueName: \"kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk\") pod \"redhat-marketplace-sm6rt\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:43 crc kubenswrapper[4909]: I1002 19:13:43.073292 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:43 crc kubenswrapper[4909]: I1002 19:13:43.554595 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:44 crc kubenswrapper[4909]: I1002 19:13:44.554701 4909 generic.go:334] "Generic (PLEG): container finished" podID="2600871d-3a18-4848-a9ab-01df656d0296" containerID="231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515" exitCode=0 Oct 02 19:13:44 crc kubenswrapper[4909]: I1002 19:13:44.554782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerDied","Data":"231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515"} Oct 02 19:13:44 crc kubenswrapper[4909]: I1002 19:13:44.555159 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerStarted","Data":"89be1be08bc3b2c7176c862b22dfcc533efc176bdb86dc3cba8482d127005552"} Oct 02 19:13:44 crc kubenswrapper[4909]: I1002 19:13:44.557705 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:13:46 crc kubenswrapper[4909]: I1002 19:13:46.586085 4909 generic.go:334] "Generic (PLEG): container finished" podID="2600871d-3a18-4848-a9ab-01df656d0296" containerID="cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac" exitCode=0 Oct 02 19:13:46 crc kubenswrapper[4909]: I1002 19:13:46.586704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerDied","Data":"cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac"} Oct 02 19:13:47 crc kubenswrapper[4909]: I1002 19:13:47.597921 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerStarted","Data":"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7"} Oct 02 19:13:47 crc kubenswrapper[4909]: I1002 19:13:47.628065 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sm6rt" podStartSLOduration=3.028364183 podStartE2EDuration="5.628003665s" podCreationTimestamp="2025-10-02 19:13:42 +0000 UTC" firstStartedPulling="2025-10-02 19:13:44.557466628 +0000 UTC m=+3345.744962487" lastFinishedPulling="2025-10-02 19:13:47.15710607 +0000 UTC m=+3348.344601969" observedRunningTime="2025-10-02 19:13:47.616122938 +0000 UTC m=+3348.803618807" watchObservedRunningTime="2025-10-02 19:13:47.628003665 +0000 UTC m=+3348.815499524" Oct 02 19:13:53 crc kubenswrapper[4909]: I1002 19:13:53.073940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:53 crc kubenswrapper[4909]: I1002 19:13:53.074502 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:53 crc kubenswrapper[4909]: I1002 19:13:53.142669 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:53 crc kubenswrapper[4909]: I1002 19:13:53.737439 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:53 crc kubenswrapper[4909]: I1002 19:13:53.800723 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:55 crc kubenswrapper[4909]: I1002 19:13:55.689942 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sm6rt" 
podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="registry-server" containerID="cri-o://e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7" gracePeriod=2 Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.251954 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.315655 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content\") pod \"2600871d-3a18-4848-a9ab-01df656d0296\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.315779 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d5bk\" (UniqueName: \"kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk\") pod \"2600871d-3a18-4848-a9ab-01df656d0296\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.315829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities\") pod \"2600871d-3a18-4848-a9ab-01df656d0296\" (UID: \"2600871d-3a18-4848-a9ab-01df656d0296\") " Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.316618 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities" (OuterVolumeSpecName: "utilities") pod "2600871d-3a18-4848-a9ab-01df656d0296" (UID: "2600871d-3a18-4848-a9ab-01df656d0296"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.324020 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk" (OuterVolumeSpecName: "kube-api-access-2d5bk") pod "2600871d-3a18-4848-a9ab-01df656d0296" (UID: "2600871d-3a18-4848-a9ab-01df656d0296"). InnerVolumeSpecName "kube-api-access-2d5bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.335811 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2600871d-3a18-4848-a9ab-01df656d0296" (UID: "2600871d-3a18-4848-a9ab-01df656d0296"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.417592 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.417631 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d5bk\" (UniqueName: \"kubernetes.io/projected/2600871d-3a18-4848-a9ab-01df656d0296-kube-api-access-2d5bk\") on node \"crc\" DevicePath \"\"" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.417646 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2600871d-3a18-4848-a9ab-01df656d0296-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.701936 4909 generic.go:334] "Generic (PLEG): container finished" podID="2600871d-3a18-4848-a9ab-01df656d0296" 
containerID="e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7" exitCode=0 Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.702007 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm6rt" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.702008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerDied","Data":"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7"} Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.702482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm6rt" event={"ID":"2600871d-3a18-4848-a9ab-01df656d0296","Type":"ContainerDied","Data":"89be1be08bc3b2c7176c862b22dfcc533efc176bdb86dc3cba8482d127005552"} Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.702525 4909 scope.go:117] "RemoveContainer" containerID="e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.726571 4909 scope.go:117] "RemoveContainer" containerID="cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.738459 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.746270 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm6rt"] Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.773981 4909 scope.go:117] "RemoveContainer" containerID="231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.798964 4909 scope.go:117] "RemoveContainer" containerID="e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7" Oct 02 
19:13:56 crc kubenswrapper[4909]: E1002 19:13:56.799415 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7\": container with ID starting with e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7 not found: ID does not exist" containerID="e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.799480 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7"} err="failed to get container status \"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7\": rpc error: code = NotFound desc = could not find container \"e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7\": container with ID starting with e7287365a554657d833f004e83ec13ec9ba4f74d0fe0c962622d3d9e353754e7 not found: ID does not exist" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.799511 4909 scope.go:117] "RemoveContainer" containerID="cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac" Oct 02 19:13:56 crc kubenswrapper[4909]: E1002 19:13:56.799898 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac\": container with ID starting with cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac not found: ID does not exist" containerID="cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.799954 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac"} err="failed to get container status 
\"cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac\": rpc error: code = NotFound desc = could not find container \"cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac\": container with ID starting with cc350f5efccf03e7c720f8be3c3ac5dfd76978f5300dca18dfbceafb309dc0ac not found: ID does not exist" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.799993 4909 scope.go:117] "RemoveContainer" containerID="231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515" Oct 02 19:13:56 crc kubenswrapper[4909]: E1002 19:13:56.800478 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515\": container with ID starting with 231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515 not found: ID does not exist" containerID="231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515" Oct 02 19:13:56 crc kubenswrapper[4909]: I1002 19:13:56.800522 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515"} err="failed to get container status \"231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515\": rpc error: code = NotFound desc = could not find container \"231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515\": container with ID starting with 231a1776ba2ab88a24268ec7bd363d7c91d7a84121a8fabcbfafdbcb0199d515 not found: ID does not exist" Oct 02 19:13:57 crc kubenswrapper[4909]: I1002 19:13:57.624806 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2600871d-3a18-4848-a9ab-01df656d0296" path="/var/lib/kubelet/pods/2600871d-3a18-4848-a9ab-01df656d0296/volumes" Oct 02 19:14:28 crc kubenswrapper[4909]: I1002 19:14:28.040964 4909 generic.go:334] "Generic (PLEG): container finished" podID="50be6a2c-c436-47ba-a5c7-d2cb51151c09" 
containerID="ed566e5c94fa933dd8f573bca0f80afb290e8ff3f00aadaa8838be3c6d87493c" exitCode=0 Oct 02 19:14:28 crc kubenswrapper[4909]: I1002 19:14:28.041058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" event={"ID":"50be6a2c-c436-47ba-a5c7-d2cb51151c09","Type":"ContainerDied","Data":"ed566e5c94fa933dd8f573bca0f80afb290e8ff3f00aadaa8838be3c6d87493c"} Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.557161 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.705697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph\") pod \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.705795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9ghz\" (UniqueName: \"kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz\") pod \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.705831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle\") pod \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.706120 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key\") pod 
\"50be6a2c-c436-47ba-a5c7-d2cb51151c09\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.706233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory\") pod \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\" (UID: \"50be6a2c-c436-47ba-a5c7-d2cb51151c09\") " Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.711653 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz" (OuterVolumeSpecName: "kube-api-access-t9ghz") pod "50be6a2c-c436-47ba-a5c7-d2cb51151c09" (UID: "50be6a2c-c436-47ba-a5c7-d2cb51151c09"). InnerVolumeSpecName "kube-api-access-t9ghz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.711743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph" (OuterVolumeSpecName: "ceph") pod "50be6a2c-c436-47ba-a5c7-d2cb51151c09" (UID: "50be6a2c-c436-47ba-a5c7-d2cb51151c09"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.713477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "50be6a2c-c436-47ba-a5c7-d2cb51151c09" (UID: "50be6a2c-c436-47ba-a5c7-d2cb51151c09"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.742403 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory" (OuterVolumeSpecName: "inventory") pod "50be6a2c-c436-47ba-a5c7-d2cb51151c09" (UID: "50be6a2c-c436-47ba-a5c7-d2cb51151c09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.745637 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50be6a2c-c436-47ba-a5c7-d2cb51151c09" (UID: "50be6a2c-c436-47ba-a5c7-d2cb51151c09"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.809207 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.809570 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.809583 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.809596 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9ghz\" (UniqueName: \"kubernetes.io/projected/50be6a2c-c436-47ba-a5c7-d2cb51151c09-kube-api-access-t9ghz\") on node \"crc\" DevicePath \"\"" Oct 02 19:14:29 crc kubenswrapper[4909]: I1002 19:14:29.809611 4909 
reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be6a2c-c436-47ba-a5c7-d2cb51151c09-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.067000 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" event={"ID":"50be6a2c-c436-47ba-a5c7-d2cb51151c09","Type":"ContainerDied","Data":"fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff"} Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.067104 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc54bcd7773e96857103f74fdf41204c732374f18fb22f571510d5bd8fd56fff" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.067138 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.147727 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67"] Oct 02 19:14:30 crc kubenswrapper[4909]: E1002 19:14:30.148197 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="extract-content" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148215 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="extract-content" Oct 02 19:14:30 crc kubenswrapper[4909]: E1002 19:14:30.148223 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="registry-server" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148231 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="registry-server" Oct 02 19:14:30 crc 
kubenswrapper[4909]: E1002 19:14:30.148264 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="extract-utilities" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148272 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="extract-utilities" Oct 02 19:14:30 crc kubenswrapper[4909]: E1002 19:14:30.148286 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50be6a2c-c436-47ba-a5c7-d2cb51151c09" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148294 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="50be6a2c-c436-47ba-a5c7-d2cb51151c09" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148489 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="50be6a2c-c436-47ba-a5c7-d2cb51151c09" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.148522 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2600871d-3a18-4848-a9ab-01df656d0296" containerName="registry-server" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.149200 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.152305 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.152709 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.152758 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.152820 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.153077 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.165200 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67"] Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.218818 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.218888 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.219014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77mld\" (UniqueName: \"kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.219060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.320602 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.320855 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77mld\" (UniqueName: \"kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.320909 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.320954 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.324490 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.325514 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.327188 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.340966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77mld\" (UniqueName: \"kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bw67\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:30 crc kubenswrapper[4909]: I1002 19:14:30.506312 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:14:31 crc kubenswrapper[4909]: I1002 19:14:31.072509 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67"] Oct 02 19:14:32 crc kubenswrapper[4909]: I1002 19:14:32.093052 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" event={"ID":"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955","Type":"ContainerStarted","Data":"7677ecb8007cf8f2a978540c5dc0bec6973092d8c9b5ece60273f379aa4035f2"} Oct 02 19:14:32 crc kubenswrapper[4909]: I1002 19:14:32.093389 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" event={"ID":"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955","Type":"ContainerStarted","Data":"e68eabd3b29807c043c6d2721f7f30b93b6f970bdb824dbff31995bf27d32f84"} Oct 02 19:14:32 crc kubenswrapper[4909]: I1002 19:14:32.111745 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" podStartSLOduration=1.3981392480000001 podStartE2EDuration="2.111726184s" podCreationTimestamp="2025-10-02 19:14:30 +0000 UTC" 
firstStartedPulling="2025-10-02 19:14:31.084134458 +0000 UTC m=+3392.271630317" lastFinishedPulling="2025-10-02 19:14:31.797721394 +0000 UTC m=+3392.985217253" observedRunningTime="2025-10-02 19:14:32.108194454 +0000 UTC m=+3393.295690313" watchObservedRunningTime="2025-10-02 19:14:32.111726184 +0000 UTC m=+3393.299222053" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.197962 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r"] Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.201325 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.207927 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.208246 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.211941 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r"] Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.332701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.332776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5g8\" (UniqueName: 
\"kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.332850 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.434905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.435108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5g8\" (UniqueName: \"kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.435214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc 
kubenswrapper[4909]: I1002 19:15:00.436910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.442122 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.456113 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5g8\" (UniqueName: \"kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8\") pod \"collect-profiles-29323875-m6x8r\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:00 crc kubenswrapper[4909]: I1002 19:15:00.546521 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:01 crc kubenswrapper[4909]: I1002 19:15:01.014698 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r"] Oct 02 19:15:01 crc kubenswrapper[4909]: I1002 19:15:01.423611 4909 generic.go:334] "Generic (PLEG): container finished" podID="956f29e1-a716-47ea-b00a-4827ce24cacf" containerID="62b922570371ced9f422dfa4e1af422fcedd70badf9b7734e27dca00cd0ef652" exitCode=0 Oct 02 19:15:01 crc kubenswrapper[4909]: I1002 19:15:01.423646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" event={"ID":"956f29e1-a716-47ea-b00a-4827ce24cacf","Type":"ContainerDied","Data":"62b922570371ced9f422dfa4e1af422fcedd70badf9b7734e27dca00cd0ef652"} Oct 02 19:15:01 crc kubenswrapper[4909]: I1002 19:15:01.423685 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" event={"ID":"956f29e1-a716-47ea-b00a-4827ce24cacf","Type":"ContainerStarted","Data":"281070cb289f1e64d8eba4d943234196be4a7bb1de3e2f11a1c6cb4876efb187"} Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.895616 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.987678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume\") pod \"956f29e1-a716-47ea-b00a-4827ce24cacf\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.987814 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj5g8\" (UniqueName: \"kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8\") pod \"956f29e1-a716-47ea-b00a-4827ce24cacf\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.987850 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume\") pod \"956f29e1-a716-47ea-b00a-4827ce24cacf\" (UID: \"956f29e1-a716-47ea-b00a-4827ce24cacf\") " Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.988924 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume" (OuterVolumeSpecName: "config-volume") pod "956f29e1-a716-47ea-b00a-4827ce24cacf" (UID: "956f29e1-a716-47ea-b00a-4827ce24cacf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.994682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "956f29e1-a716-47ea-b00a-4827ce24cacf" (UID: "956f29e1-a716-47ea-b00a-4827ce24cacf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:02 crc kubenswrapper[4909]: I1002 19:15:02.994970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8" (OuterVolumeSpecName: "kube-api-access-nj5g8") pod "956f29e1-a716-47ea-b00a-4827ce24cacf" (UID: "956f29e1-a716-47ea-b00a-4827ce24cacf"). InnerVolumeSpecName "kube-api-access-nj5g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.090988 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956f29e1-a716-47ea-b00a-4827ce24cacf-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.091085 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj5g8\" (UniqueName: \"kubernetes.io/projected/956f29e1-a716-47ea-b00a-4827ce24cacf-kube-api-access-nj5g8\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.091111 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956f29e1-a716-47ea-b00a-4827ce24cacf-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.447962 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" event={"ID":"956f29e1-a716-47ea-b00a-4827ce24cacf","Type":"ContainerDied","Data":"281070cb289f1e64d8eba4d943234196be4a7bb1de3e2f11a1c6cb4876efb187"} Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.448387 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="281070cb289f1e64d8eba4d943234196be4a7bb1de3e2f11a1c6cb4876efb187" Oct 02 19:15:03 crc kubenswrapper[4909]: I1002 19:15:03.447991 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r" Oct 02 19:15:04 crc kubenswrapper[4909]: I1002 19:15:04.004322 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4"] Oct 02 19:15:04 crc kubenswrapper[4909]: I1002 19:15:04.016135 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323830-8kpp4"] Oct 02 19:15:04 crc kubenswrapper[4909]: I1002 19:15:04.460928 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" containerID="7677ecb8007cf8f2a978540c5dc0bec6973092d8c9b5ece60273f379aa4035f2" exitCode=0 Oct 02 19:15:04 crc kubenswrapper[4909]: I1002 19:15:04.460999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" event={"ID":"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955","Type":"ContainerDied","Data":"7677ecb8007cf8f2a978540c5dc0bec6973092d8c9b5ece60273f379aa4035f2"} Oct 02 19:15:05 crc kubenswrapper[4909]: I1002 19:15:05.626345 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d661e3c-9a8e-4823-98c0-7abf07813d89" path="/var/lib/kubelet/pods/7d661e3c-9a8e-4823-98c0-7abf07813d89/volumes" Oct 02 19:15:05 crc kubenswrapper[4909]: I1002 19:15:05.994369 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.057212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key\") pod \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.057302 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77mld\" (UniqueName: \"kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld\") pod \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.057345 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory\") pod \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.057390 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph\") pod \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\" (UID: \"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955\") " Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.063769 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph" (OuterVolumeSpecName: "ceph") pod "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" (UID: "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.065153 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld" (OuterVolumeSpecName: "kube-api-access-77mld") pod "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" (UID: "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955"). InnerVolumeSpecName "kube-api-access-77mld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.106879 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory" (OuterVolumeSpecName: "inventory") pod "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" (UID: "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.112071 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" (UID: "1ee1ee0b-fca9-4cc7-8016-e8a623cb5955"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.161713 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.161792 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77mld\" (UniqueName: \"kubernetes.io/projected/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-kube-api-access-77mld\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.161813 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.161859 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ee1ee0b-fca9-4cc7-8016-e8a623cb5955-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.485892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" event={"ID":"1ee1ee0b-fca9-4cc7-8016-e8a623cb5955","Type":"ContainerDied","Data":"e68eabd3b29807c043c6d2721f7f30b93b6f970bdb824dbff31995bf27d32f84"} Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.486228 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68eabd3b29807c043c6d2721f7f30b93b6f970bdb824dbff31995bf27d32f84" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.485981 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bw67" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.585484 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n"] Oct 02 19:15:06 crc kubenswrapper[4909]: E1002 19:15:06.586368 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956f29e1-a716-47ea-b00a-4827ce24cacf" containerName="collect-profiles" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.586395 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="956f29e1-a716-47ea-b00a-4827ce24cacf" containerName="collect-profiles" Oct 02 19:15:06 crc kubenswrapper[4909]: E1002 19:15:06.586424 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.586434 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.586846 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee1ee0b-fca9-4cc7-8016-e8a623cb5955" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.586880 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="956f29e1-a716-47ea-b00a-4827ce24cacf" containerName="collect-profiles" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.588002 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.592609 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.592868 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.593107 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.597484 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n"] Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.598468 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.603060 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.676232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.676807 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: 
\"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.677044 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v655k\" (UniqueName: \"kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.677130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.778657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.778761 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v655k\" (UniqueName: \"kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 
19:15:06.778806 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.778866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.783194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.783268 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.783931 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: 
\"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.801793 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v655k\" (UniqueName: \"kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:06 crc kubenswrapper[4909]: I1002 19:15:06.927014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:07 crc kubenswrapper[4909]: I1002 19:15:07.347247 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n"] Oct 02 19:15:07 crc kubenswrapper[4909]: I1002 19:15:07.519595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" event={"ID":"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20","Type":"ContainerStarted","Data":"007c06ce139449198ddb83e0281cc70174dc85e56a7c33835101fe838e848c0f"} Oct 02 19:15:08 crc kubenswrapper[4909]: I1002 19:15:08.532241 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" event={"ID":"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20","Type":"ContainerStarted","Data":"ab752dd28c19d60cc38837ca8ad7f9212739a815d70961692b32751a3c0affee"} Oct 02 19:15:08 crc kubenswrapper[4909]: I1002 19:15:08.566413 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" podStartSLOduration=2.014724786 podStartE2EDuration="2.566388031s" podCreationTimestamp="2025-10-02 19:15:06 +0000 
UTC" firstStartedPulling="2025-10-02 19:15:07.347441924 +0000 UTC m=+3428.534937813" lastFinishedPulling="2025-10-02 19:15:07.899105199 +0000 UTC m=+3429.086601058" observedRunningTime="2025-10-02 19:15:08.554729891 +0000 UTC m=+3429.742225750" watchObservedRunningTime="2025-10-02 19:15:08.566388031 +0000 UTC m=+3429.753883890" Oct 02 19:15:15 crc kubenswrapper[4909]: I1002 19:15:15.619713 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" containerID="ab752dd28c19d60cc38837ca8ad7f9212739a815d70961692b32751a3c0affee" exitCode=0 Oct 02 19:15:15 crc kubenswrapper[4909]: I1002 19:15:15.632617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" event={"ID":"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20","Type":"ContainerDied","Data":"ab752dd28c19d60cc38837ca8ad7f9212739a815d70961692b32751a3c0affee"} Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.212633 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.360566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph\") pod \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.360644 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v655k\" (UniqueName: \"kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k\") pod \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.360727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key\") pod \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.360838 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory\") pod \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\" (UID: \"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20\") " Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.366944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k" (OuterVolumeSpecName: "kube-api-access-v655k") pod "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" (UID: "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20"). InnerVolumeSpecName "kube-api-access-v655k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.367537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph" (OuterVolumeSpecName: "ceph") pod "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" (UID: "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.392646 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" (UID: "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.413839 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory" (OuterVolumeSpecName: "inventory") pod "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" (UID: "9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.463227 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.463289 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v655k\" (UniqueName: \"kubernetes.io/projected/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-kube-api-access-v655k\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.463311 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.463329 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.653127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" event={"ID":"9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20","Type":"ContainerDied","Data":"007c06ce139449198ddb83e0281cc70174dc85e56a7c33835101fe838e848c0f"} Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.653504 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007c06ce139449198ddb83e0281cc70174dc85e56a7c33835101fe838e848c0f" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.653180 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.741410 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm"] Oct 02 19:15:17 crc kubenswrapper[4909]: E1002 19:15:17.741903 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.741939 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.742191 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.743017 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.745011 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.745581 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.746389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.746654 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.754937 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.762975 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm"] Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.779556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqf62\" (UniqueName: \"kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.779711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.779826 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.780090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.881510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.881814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqf62\" (UniqueName: \"kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.881886 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.881960 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.885604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.885886 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.888367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:17 crc kubenswrapper[4909]: I1002 19:15:17.897274 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sqf62\" (UniqueName: \"kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c2klm\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:18 crc kubenswrapper[4909]: I1002 19:15:18.080263 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:15:18 crc kubenswrapper[4909]: I1002 19:15:18.683158 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm"] Oct 02 19:15:18 crc kubenswrapper[4909]: W1002 19:15:18.689510 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ba2b75_7e2f_43f7_a72e_f5cdad40cc55.slice/crio-e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f WatchSource:0}: Error finding container e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f: Status 404 returned error can't find the container with id e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f Oct 02 19:15:19 crc kubenswrapper[4909]: I1002 19:15:19.679873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" event={"ID":"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55","Type":"ContainerStarted","Data":"9447ae8327c014065500f859c2a2947006cdccfb987728917cf4c441b93da6d3"} Oct 02 19:15:19 crc kubenswrapper[4909]: I1002 19:15:19.680569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" event={"ID":"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55","Type":"ContainerStarted","Data":"e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f"} Oct 02 19:15:24 crc 
kubenswrapper[4909]: I1002 19:15:24.988312 4909 scope.go:117] "RemoveContainer" containerID="332a3f19c434e5b1f11cdc97c013bf23272ac622306facdbffbaba594f650af7" Oct 02 19:15:53 crc kubenswrapper[4909]: I1002 19:15:53.055078 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:15:53 crc kubenswrapper[4909]: I1002 19:15:53.055569 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:16:09 crc kubenswrapper[4909]: I1002 19:16:09.295627 4909 generic.go:334] "Generic (PLEG): container finished" podID="a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" containerID="9447ae8327c014065500f859c2a2947006cdccfb987728917cf4c441b93da6d3" exitCode=0 Oct 02 19:16:09 crc kubenswrapper[4909]: I1002 19:16:09.295749 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" event={"ID":"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55","Type":"ContainerDied","Data":"9447ae8327c014065500f859c2a2947006cdccfb987728917cf4c441b93da6d3"} Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.809829 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.930970 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph\") pod \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.931129 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory\") pod \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.931393 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqf62\" (UniqueName: \"kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62\") pod \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.931482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key\") pod \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\" (UID: \"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55\") " Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.938516 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62" (OuterVolumeSpecName: "kube-api-access-sqf62") pod "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" (UID: "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55"). InnerVolumeSpecName "kube-api-access-sqf62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.940254 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph" (OuterVolumeSpecName: "ceph") pod "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" (UID: "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.981336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory" (OuterVolumeSpecName: "inventory") pod "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" (UID: "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:10 crc kubenswrapper[4909]: I1002 19:16:10.985339 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" (UID: "a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.033660 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqf62\" (UniqueName: \"kubernetes.io/projected/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-kube-api-access-sqf62\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.033691 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.033701 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.033711 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.322928 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" event={"ID":"a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55","Type":"ContainerDied","Data":"e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f"} Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.322992 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e78a5c1d7ae430a0c8d664cfedfb69d7cc4e0065b7448998fc6a02eaa22c2f" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.323071 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c2klm" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.423017 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg"] Oct 02 19:16:11 crc kubenswrapper[4909]: E1002 19:16:11.423504 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.423525 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.423747 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.424635 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.427646 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.428516 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.428630 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.429428 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.429830 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.443804 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg"] Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.543288 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.543436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.543600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.543726 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl949\" (UniqueName: \"kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.645652 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.646010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.646059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gl949\" (UniqueName: \"kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.646121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.650055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.650409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.650807 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc 
kubenswrapper[4909]: I1002 19:16:11.664350 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl949\" (UniqueName: \"kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:11 crc kubenswrapper[4909]: I1002 19:16:11.740918 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:12 crc kubenswrapper[4909]: W1002 19:16:12.340769 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode78721da_178f_402b_a402_3709a89b744e.slice/crio-6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3 WatchSource:0}: Error finding container 6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3: Status 404 returned error can't find the container with id 6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3 Oct 02 19:16:12 crc kubenswrapper[4909]: I1002 19:16:12.349756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg"] Oct 02 19:16:13 crc kubenswrapper[4909]: I1002 19:16:13.349181 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" event={"ID":"e78721da-178f-402b-a402-3709a89b744e","Type":"ContainerStarted","Data":"b2ded53502da9826eaec6c1da45efb4a5e568a6b3ad36110a771bbb0bde1c0c9"} Oct 02 19:16:13 crc kubenswrapper[4909]: I1002 19:16:13.349558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" 
event={"ID":"e78721da-178f-402b-a402-3709a89b744e","Type":"ContainerStarted","Data":"6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3"} Oct 02 19:16:13 crc kubenswrapper[4909]: I1002 19:16:13.380655 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" podStartSLOduration=1.732918367 podStartE2EDuration="2.380630205s" podCreationTimestamp="2025-10-02 19:16:11 +0000 UTC" firstStartedPulling="2025-10-02 19:16:12.34495829 +0000 UTC m=+3493.532454179" lastFinishedPulling="2025-10-02 19:16:12.992670148 +0000 UTC m=+3494.180166017" observedRunningTime="2025-10-02 19:16:13.36816895 +0000 UTC m=+3494.555664849" watchObservedRunningTime="2025-10-02 19:16:13.380630205 +0000 UTC m=+3494.568126094" Oct 02 19:16:18 crc kubenswrapper[4909]: I1002 19:16:18.421451 4909 generic.go:334] "Generic (PLEG): container finished" podID="e78721da-178f-402b-a402-3709a89b744e" containerID="b2ded53502da9826eaec6c1da45efb4a5e568a6b3ad36110a771bbb0bde1c0c9" exitCode=0 Oct 02 19:16:18 crc kubenswrapper[4909]: I1002 19:16:18.421577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" event={"ID":"e78721da-178f-402b-a402-3709a89b744e","Type":"ContainerDied","Data":"b2ded53502da9826eaec6c1da45efb4a5e568a6b3ad36110a771bbb0bde1c0c9"} Oct 02 19:16:19 crc kubenswrapper[4909]: I1002 19:16:19.990740 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.141579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key\") pod \"e78721da-178f-402b-a402-3709a89b744e\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.142017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory\") pod \"e78721da-178f-402b-a402-3709a89b744e\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.142155 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl949\" (UniqueName: \"kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949\") pod \"e78721da-178f-402b-a402-3709a89b744e\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.142219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph\") pod \"e78721da-178f-402b-a402-3709a89b744e\" (UID: \"e78721da-178f-402b-a402-3709a89b744e\") " Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.147666 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949" (OuterVolumeSpecName: "kube-api-access-gl949") pod "e78721da-178f-402b-a402-3709a89b744e" (UID: "e78721da-178f-402b-a402-3709a89b744e"). InnerVolumeSpecName "kube-api-access-gl949". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.147956 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph" (OuterVolumeSpecName: "ceph") pod "e78721da-178f-402b-a402-3709a89b744e" (UID: "e78721da-178f-402b-a402-3709a89b744e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.180556 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory" (OuterVolumeSpecName: "inventory") pod "e78721da-178f-402b-a402-3709a89b744e" (UID: "e78721da-178f-402b-a402-3709a89b744e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.196813 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e78721da-178f-402b-a402-3709a89b744e" (UID: "e78721da-178f-402b-a402-3709a89b744e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.244958 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl949\" (UniqueName: \"kubernetes.io/projected/e78721da-178f-402b-a402-3709a89b744e-kube-api-access-gl949\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.244986 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.244995 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.245003 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e78721da-178f-402b-a402-3709a89b744e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.469222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" event={"ID":"e78721da-178f-402b-a402-3709a89b744e","Type":"ContainerDied","Data":"6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3"} Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.469339 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bcc817d7a4ad06825c4bb4637071adecee16e55cdfc700556cbf63753c650f3" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.469446 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.563604 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf"] Oct 02 19:16:20 crc kubenswrapper[4909]: E1002 19:16:20.564255 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78721da-178f-402b-a402-3709a89b744e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.564280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78721da-178f-402b-a402-3709a89b744e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.564543 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78721da-178f-402b-a402-3709a89b744e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.565460 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.569008 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.569023 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.569181 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.569798 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.570553 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.573636 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf"] Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.653640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zpzw\" (UniqueName: \"kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.653705 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: 
\"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.653898 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.653935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.756807 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.756886 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.757008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8zpzw\" (UniqueName: \"kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.757083 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.760234 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.762972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.764539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 
crc kubenswrapper[4909]: I1002 19:16:20.774170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zpzw\" (UniqueName: \"kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:20 crc kubenswrapper[4909]: I1002 19:16:20.886680 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:16:21 crc kubenswrapper[4909]: I1002 19:16:21.441550 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf"] Oct 02 19:16:21 crc kubenswrapper[4909]: W1002 19:16:21.441609 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d34a713_2f9f_4d3d_9341_eb2419b2057d.slice/crio-1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4 WatchSource:0}: Error finding container 1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4: Status 404 returned error can't find the container with id 1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4 Oct 02 19:16:21 crc kubenswrapper[4909]: I1002 19:16:21.481130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" event={"ID":"7d34a713-2f9f-4d3d-9341-eb2419b2057d","Type":"ContainerStarted","Data":"1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4"} Oct 02 19:16:22 crc kubenswrapper[4909]: I1002 19:16:22.495796 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" 
event={"ID":"7d34a713-2f9f-4d3d-9341-eb2419b2057d","Type":"ContainerStarted","Data":"50118e36631e06a619be87491cbcb952e0b631cd4989dcfec46720d7ffc67379"} Oct 02 19:16:22 crc kubenswrapper[4909]: I1002 19:16:22.522367 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" podStartSLOduration=1.983578809 podStartE2EDuration="2.522345374s" podCreationTimestamp="2025-10-02 19:16:20 +0000 UTC" firstStartedPulling="2025-10-02 19:16:21.444922187 +0000 UTC m=+3502.632418056" lastFinishedPulling="2025-10-02 19:16:21.983688762 +0000 UTC m=+3503.171184621" observedRunningTime="2025-10-02 19:16:22.512727747 +0000 UTC m=+3503.700223646" watchObservedRunningTime="2025-10-02 19:16:22.522345374 +0000 UTC m=+3503.709841233" Oct 02 19:16:23 crc kubenswrapper[4909]: I1002 19:16:23.054722 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:16:23 crc kubenswrapper[4909]: I1002 19:16:23.054811 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.054230 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.054688 4909 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.054732 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.055456 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.055511 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d" gracePeriod=600 Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.844172 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d" exitCode=0 Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.844238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d"} Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.844721 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756"} Oct 02 19:16:53 crc kubenswrapper[4909]: I1002 19:16:53.844742 4909 scope.go:117] "RemoveContainer" containerID="a6893c3c85ece0a2ac38b8922023f6385e4e068feb54c07b336820b52549152f" Oct 02 19:17:24 crc kubenswrapper[4909]: I1002 19:17:24.201215 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d34a713-2f9f-4d3d-9341-eb2419b2057d" containerID="50118e36631e06a619be87491cbcb952e0b631cd4989dcfec46720d7ffc67379" exitCode=0 Oct 02 19:17:24 crc kubenswrapper[4909]: I1002 19:17:24.201304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" event={"ID":"7d34a713-2f9f-4d3d-9341-eb2419b2057d","Type":"ContainerDied","Data":"50118e36631e06a619be87491cbcb952e0b631cd4989dcfec46720d7ffc67379"} Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.718535 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.887306 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory\") pod \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.887659 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zpzw\" (UniqueName: \"kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw\") pod \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.887793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph\") pod \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.888005 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key\") pod \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\" (UID: \"7d34a713-2f9f-4d3d-9341-eb2419b2057d\") " Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.895745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw" (OuterVolumeSpecName: "kube-api-access-8zpzw") pod "7d34a713-2f9f-4d3d-9341-eb2419b2057d" (UID: "7d34a713-2f9f-4d3d-9341-eb2419b2057d"). InnerVolumeSpecName "kube-api-access-8zpzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.896472 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph" (OuterVolumeSpecName: "ceph") pod "7d34a713-2f9f-4d3d-9341-eb2419b2057d" (UID: "7d34a713-2f9f-4d3d-9341-eb2419b2057d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.937879 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d34a713-2f9f-4d3d-9341-eb2419b2057d" (UID: "7d34a713-2f9f-4d3d-9341-eb2419b2057d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.949796 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory" (OuterVolumeSpecName: "inventory") pod "7d34a713-2f9f-4d3d-9341-eb2419b2057d" (UID: "7d34a713-2f9f-4d3d-9341-eb2419b2057d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.991376 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.991412 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zpzw\" (UniqueName: \"kubernetes.io/projected/7d34a713-2f9f-4d3d-9341-eb2419b2057d-kube-api-access-8zpzw\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.991428 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:25 crc kubenswrapper[4909]: I1002 19:17:25.991439 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d34a713-2f9f-4d3d-9341-eb2419b2057d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.229563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" event={"ID":"7d34a713-2f9f-4d3d-9341-eb2419b2057d","Type":"ContainerDied","Data":"1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4"} Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.229835 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f63f79e7951feedfcc987829026b5c07c3f20cfbc2955c583f6537cae31b8a4" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.229647 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.353707 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qxxtj"] Oct 02 19:17:26 crc kubenswrapper[4909]: E1002 19:17:26.354453 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d34a713-2f9f-4d3d-9341-eb2419b2057d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.354491 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d34a713-2f9f-4d3d-9341-eb2419b2057d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.354886 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d34a713-2f9f-4d3d-9341-eb2419b2057d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.356221 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.358916 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.359194 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.359689 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.360846 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.362180 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qxxtj"] Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.362644 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.516045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.516256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zctx9\" (UniqueName: \"kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 
19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.516324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.516548 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.618624 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.618755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zctx9\" (UniqueName: \"kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.618843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.619226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.627759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.627759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.629187 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.641664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zctx9\" (UniqueName: \"kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9\") pod \"ssh-known-hosts-edpm-deployment-qxxtj\" (UID: 
\"50a84617-a1af-4b99-a42d-d72651c450dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:26 crc kubenswrapper[4909]: I1002 19:17:26.678814 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:27 crc kubenswrapper[4909]: I1002 19:17:27.298800 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qxxtj"] Oct 02 19:17:28 crc kubenswrapper[4909]: I1002 19:17:28.256865 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" event={"ID":"50a84617-a1af-4b99-a42d-d72651c450dd","Type":"ContainerStarted","Data":"027bede558060bd79ccc14fdf8ddfbcd452e0d571513fa2ab7772171e409d546"} Oct 02 19:17:28 crc kubenswrapper[4909]: I1002 19:17:28.257519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" event={"ID":"50a84617-a1af-4b99-a42d-d72651c450dd","Type":"ContainerStarted","Data":"1190bcb1376e0d6468ec858c03b856ee0b01e569e743a30fa40faea24cfe3303"} Oct 02 19:17:28 crc kubenswrapper[4909]: I1002 19:17:28.279542 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" podStartSLOduration=1.647224833 podStartE2EDuration="2.279521943s" podCreationTimestamp="2025-10-02 19:17:26 +0000 UTC" firstStartedPulling="2025-10-02 19:17:27.299390597 +0000 UTC m=+3568.486886456" lastFinishedPulling="2025-10-02 19:17:27.931687677 +0000 UTC m=+3569.119183566" observedRunningTime="2025-10-02 19:17:28.268253104 +0000 UTC m=+3569.455748963" watchObservedRunningTime="2025-10-02 19:17:28.279521943 +0000 UTC m=+3569.467017822" Oct 02 19:17:41 crc kubenswrapper[4909]: I1002 19:17:41.418768 4909 generic.go:334] "Generic (PLEG): container finished" podID="50a84617-a1af-4b99-a42d-d72651c450dd" containerID="027bede558060bd79ccc14fdf8ddfbcd452e0d571513fa2ab7772171e409d546" 
exitCode=0 Oct 02 19:17:41 crc kubenswrapper[4909]: I1002 19:17:41.418854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" event={"ID":"50a84617-a1af-4b99-a42d-d72651c450dd","Type":"ContainerDied","Data":"027bede558060bd79ccc14fdf8ddfbcd452e0d571513fa2ab7772171e409d546"} Oct 02 19:17:42 crc kubenswrapper[4909]: I1002 19:17:42.928437 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.008354 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zctx9\" (UniqueName: \"kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9\") pod \"50a84617-a1af-4b99-a42d-d72651c450dd\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.008465 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0\") pod \"50a84617-a1af-4b99-a42d-d72651c450dd\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.008607 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph\") pod \"50a84617-a1af-4b99-a42d-d72651c450dd\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.008845 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam\") pod \"50a84617-a1af-4b99-a42d-d72651c450dd\" (UID: \"50a84617-a1af-4b99-a42d-d72651c450dd\") " Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 
19:17:43.024726 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph" (OuterVolumeSpecName: "ceph") pod "50a84617-a1af-4b99-a42d-d72651c450dd" (UID: "50a84617-a1af-4b99-a42d-d72651c450dd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.024793 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9" (OuterVolumeSpecName: "kube-api-access-zctx9") pod "50a84617-a1af-4b99-a42d-d72651c450dd" (UID: "50a84617-a1af-4b99-a42d-d72651c450dd"). InnerVolumeSpecName "kube-api-access-zctx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.046144 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "50a84617-a1af-4b99-a42d-d72651c450dd" (UID: "50a84617-a1af-4b99-a42d-d72651c450dd"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.046575 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "50a84617-a1af-4b99-a42d-d72651c450dd" (UID: "50a84617-a1af-4b99-a42d-d72651c450dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.112002 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zctx9\" (UniqueName: \"kubernetes.io/projected/50a84617-a1af-4b99-a42d-d72651c450dd-kube-api-access-zctx9\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.112070 4909 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.112085 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.112097 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50a84617-a1af-4b99-a42d-d72651c450dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.445994 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" event={"ID":"50a84617-a1af-4b99-a42d-d72651c450dd","Type":"ContainerDied","Data":"1190bcb1376e0d6468ec858c03b856ee0b01e569e743a30fa40faea24cfe3303"} Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.446067 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1190bcb1376e0d6468ec858c03b856ee0b01e569e743a30fa40faea24cfe3303" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.446183 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qxxtj" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.569321 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7"] Oct 02 19:17:43 crc kubenswrapper[4909]: E1002 19:17:43.570296 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a84617-a1af-4b99-a42d-d72651c450dd" containerName="ssh-known-hosts-edpm-deployment" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.570313 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a84617-a1af-4b99-a42d-d72651c450dd" containerName="ssh-known-hosts-edpm-deployment" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.570722 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a84617-a1af-4b99-a42d-d72651c450dd" containerName="ssh-known-hosts-edpm-deployment" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.572282 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.575018 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.575402 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.575687 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.576092 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.576328 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.579792 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7"] Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.727635 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.727805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.728481 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.728640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8js\" (UniqueName: \"kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.830638 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.830841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.830897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8js\" (UniqueName: 
\"kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.830944 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.837506 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.838154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.840503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.862180 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hx8js\" (UniqueName: \"kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9m7r7\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:43 crc kubenswrapper[4909]: I1002 19:17:43.903532 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:44 crc kubenswrapper[4909]: W1002 19:17:44.482791 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b67f438_661f_476b_867c_8d1f4e742634.slice/crio-841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86 WatchSource:0}: Error finding container 841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86: Status 404 returned error can't find the container with id 841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86 Oct 02 19:17:44 crc kubenswrapper[4909]: I1002 19:17:44.488841 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7"] Oct 02 19:17:45 crc kubenswrapper[4909]: I1002 19:17:45.475835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" event={"ID":"3b67f438-661f-476b-867c-8d1f4e742634","Type":"ContainerStarted","Data":"a565e961b4d65e193dfcfaec5d16c0ef051ef2b1f642e68c93c93b58d95016b3"} Oct 02 19:17:45 crc kubenswrapper[4909]: I1002 19:17:45.476285 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" event={"ID":"3b67f438-661f-476b-867c-8d1f4e742634","Type":"ContainerStarted","Data":"841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86"} Oct 02 19:17:45 crc kubenswrapper[4909]: I1002 
19:17:45.518333 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" podStartSLOduration=2.045582214 podStartE2EDuration="2.518299746s" podCreationTimestamp="2025-10-02 19:17:43 +0000 UTC" firstStartedPulling="2025-10-02 19:17:44.487352046 +0000 UTC m=+3585.674847905" lastFinishedPulling="2025-10-02 19:17:44.960069568 +0000 UTC m=+3586.147565437" observedRunningTime="2025-10-02 19:17:45.502066473 +0000 UTC m=+3586.689562342" watchObservedRunningTime="2025-10-02 19:17:45.518299746 +0000 UTC m=+3586.705795645" Oct 02 19:17:55 crc kubenswrapper[4909]: I1002 19:17:55.624297 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b67f438-661f-476b-867c-8d1f4e742634" containerID="a565e961b4d65e193dfcfaec5d16c0ef051ef2b1f642e68c93c93b58d95016b3" exitCode=0 Oct 02 19:17:55 crc kubenswrapper[4909]: I1002 19:17:55.628909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" event={"ID":"3b67f438-661f-476b-867c-8d1f4e742634","Type":"ContainerDied","Data":"a565e961b4d65e193dfcfaec5d16c0ef051ef2b1f642e68c93c93b58d95016b3"} Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.124622 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.235411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8js\" (UniqueName: \"kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js\") pod \"3b67f438-661f-476b-867c-8d1f4e742634\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.235625 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph\") pod \"3b67f438-661f-476b-867c-8d1f4e742634\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.235697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory\") pod \"3b67f438-661f-476b-867c-8d1f4e742634\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.235735 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key\") pod \"3b67f438-661f-476b-867c-8d1f4e742634\" (UID: \"3b67f438-661f-476b-867c-8d1f4e742634\") " Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.241191 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph" (OuterVolumeSpecName: "ceph") pod "3b67f438-661f-476b-867c-8d1f4e742634" (UID: "3b67f438-661f-476b-867c-8d1f4e742634"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.242009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js" (OuterVolumeSpecName: "kube-api-access-hx8js") pod "3b67f438-661f-476b-867c-8d1f4e742634" (UID: "3b67f438-661f-476b-867c-8d1f4e742634"). InnerVolumeSpecName "kube-api-access-hx8js". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.277199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory" (OuterVolumeSpecName: "inventory") pod "3b67f438-661f-476b-867c-8d1f4e742634" (UID: "3b67f438-661f-476b-867c-8d1f4e742634"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.282282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3b67f438-661f-476b-867c-8d1f4e742634" (UID: "3b67f438-661f-476b-867c-8d1f4e742634"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.338729 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.338765 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.338777 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b67f438-661f-476b-867c-8d1f4e742634-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.338785 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8js\" (UniqueName: \"kubernetes.io/projected/3b67f438-661f-476b-867c-8d1f4e742634-kube-api-access-hx8js\") on node \"crc\" DevicePath \"\"" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.710183 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" event={"ID":"3b67f438-661f-476b-867c-8d1f4e742634","Type":"ContainerDied","Data":"841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86"} Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.710474 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841a49e1aca22ac134401434e601b943b4b8094a4bf5b3387b6be5d7f82a7d86" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.710279 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9m7r7" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.753698 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6"] Oct 02 19:17:57 crc kubenswrapper[4909]: E1002 19:17:57.754196 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b67f438-661f-476b-867c-8d1f4e742634" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.754216 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b67f438-661f-476b-867c-8d1f4e742634" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.755244 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b67f438-661f-476b-867c-8d1f4e742634" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.756086 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.759642 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.759877 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.759992 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.760244 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.760365 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.767127 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6"] Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.852703 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.853077 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlqg\" (UniqueName: \"kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.853242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.853729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.955556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.955650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.955693 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlqg\" (UniqueName: 
\"kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.955744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.960709 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.961732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.962328 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:57 crc kubenswrapper[4909]: I1002 19:17:57.982149 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlqg\" (UniqueName: \"kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:58 crc kubenswrapper[4909]: I1002 19:17:58.104235 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:17:58 crc kubenswrapper[4909]: I1002 19:17:58.716361 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6"] Oct 02 19:17:58 crc kubenswrapper[4909]: I1002 19:17:58.734669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" event={"ID":"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0","Type":"ContainerStarted","Data":"91ea8ea5706289e36cc3eb1daa6fc594786038e139bfc7ea7a0fb4bfa34082d5"} Oct 02 19:17:59 crc kubenswrapper[4909]: I1002 19:17:59.744955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" event={"ID":"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0","Type":"ContainerStarted","Data":"0e2960721c134c42f868e3be6a3fd6b4ade4b1a2339fe65c76157bd89a4efe3b"} Oct 02 19:17:59 crc kubenswrapper[4909]: I1002 19:17:59.776208 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" podStartSLOduration=2.081637559 podStartE2EDuration="2.776188996s" podCreationTimestamp="2025-10-02 19:17:57 +0000 UTC" firstStartedPulling="2025-10-02 19:17:58.725361211 +0000 UTC m=+3599.912857070" lastFinishedPulling="2025-10-02 19:17:59.419912618 +0000 UTC m=+3600.607408507" observedRunningTime="2025-10-02 19:17:59.771440718 +0000 UTC 
m=+3600.958936597" watchObservedRunningTime="2025-10-02 19:17:59.776188996 +0000 UTC m=+3600.963684865" Oct 02 19:18:11 crc kubenswrapper[4909]: I1002 19:18:11.898827 4909 generic.go:334] "Generic (PLEG): container finished" podID="dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" containerID="0e2960721c134c42f868e3be6a3fd6b4ade4b1a2339fe65c76157bd89a4efe3b" exitCode=0 Oct 02 19:18:11 crc kubenswrapper[4909]: I1002 19:18:11.898909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" event={"ID":"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0","Type":"ContainerDied","Data":"0e2960721c134c42f868e3be6a3fd6b4ade4b1a2339fe65c76157bd89a4efe3b"} Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.417483 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.529668 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory\") pod \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.529743 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key\") pod \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.529928 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph\") pod \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.529972 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tlqg\" (UniqueName: \"kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg\") pod \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\" (UID: \"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0\") " Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.535685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph" (OuterVolumeSpecName: "ceph") pod "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" (UID: "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.536363 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg" (OuterVolumeSpecName: "kube-api-access-6tlqg") pod "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" (UID: "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0"). InnerVolumeSpecName "kube-api-access-6tlqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.563000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" (UID: "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.568377 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory" (OuterVolumeSpecName: "inventory") pod "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" (UID: "dc36b692-f1b5-4320-9dd0-dbc68b7b5db0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.632021 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.632285 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.632351 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.632422 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tlqg\" (UniqueName: \"kubernetes.io/projected/dc36b692-f1b5-4320-9dd0-dbc68b7b5db0-kube-api-access-6tlqg\") on node \"crc\" DevicePath \"\"" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.926271 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" event={"ID":"dc36b692-f1b5-4320-9dd0-dbc68b7b5db0","Type":"ContainerDied","Data":"91ea8ea5706289e36cc3eb1daa6fc594786038e139bfc7ea7a0fb4bfa34082d5"} Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.926316 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ea8ea5706289e36cc3eb1daa6fc594786038e139bfc7ea7a0fb4bfa34082d5" Oct 02 19:18:13 crc kubenswrapper[4909]: I1002 19:18:13.926380 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.025138 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp"] Oct 02 19:18:14 crc kubenswrapper[4909]: E1002 19:18:14.025804 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.025837 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.026208 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc36b692-f1b5-4320-9dd0-dbc68b7b5db0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.027433 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.030603 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.030926 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.031190 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.031299 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.031758 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.032297 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.032665 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.032755 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.034704 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.035415 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp"] Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 
19:18:14.036327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.145790 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.145831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.145864 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.145896 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: 
\"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.145941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146055 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgqkz\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146118 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146160 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146191 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146261 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146318 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: 
\"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.146345 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.247982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248021 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248115 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248171 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgqkz\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 
19:18:14.248210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248266 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248303 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248393 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248435 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.248482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.253551 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.254752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.255244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.255615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.255677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.255619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.257595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.258767 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.263735 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.264054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.264498 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 
19:18:14.264720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.265329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.265859 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.265970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgqkz\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.267373 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.267814 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:14 crc kubenswrapper[4909]: I1002 19:18:14.361012 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:18:15 crc kubenswrapper[4909]: I1002 19:18:15.042835 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp"] Oct 02 19:18:15 crc kubenswrapper[4909]: I1002 19:18:15.956060 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" event={"ID":"db8ea573-a0c2-4c11-9f5b-4122524f6384","Type":"ContainerStarted","Data":"1548fb234c1e1abf3dab07089a90d9f584ec44ae81d6de8786241f595b5c4fe7"} Oct 02 19:18:15 crc kubenswrapper[4909]: I1002 19:18:15.956539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" event={"ID":"db8ea573-a0c2-4c11-9f5b-4122524f6384","Type":"ContainerStarted","Data":"fc035f49451ca8a60c23b57eaf3ca2f870efb850ac719a4afee556448c8f989a"} Oct 02 19:18:53 crc kubenswrapper[4909]: I1002 19:18:53.054561 4909 patch_prober.go:28] interesting 
pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:18:53 crc kubenswrapper[4909]: I1002 19:18:53.055587 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:19:22 crc kubenswrapper[4909]: I1002 19:19:22.741994 4909 generic.go:334] "Generic (PLEG): container finished" podID="db8ea573-a0c2-4c11-9f5b-4122524f6384" containerID="1548fb234c1e1abf3dab07089a90d9f584ec44ae81d6de8786241f595b5c4fe7" exitCode=0 Oct 02 19:19:22 crc kubenswrapper[4909]: I1002 19:19:22.742107 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" event={"ID":"db8ea573-a0c2-4c11-9f5b-4122524f6384","Type":"ContainerDied","Data":"1548fb234c1e1abf3dab07089a90d9f584ec44ae81d6de8786241f595b5c4fe7"} Oct 02 19:19:23 crc kubenswrapper[4909]: I1002 19:19:23.054358 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:19:23 crc kubenswrapper[4909]: I1002 19:19:23.054432 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 
19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.335072 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.512565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.512856 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.512912 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.512936 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.512958 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513019 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513079 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513107 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513129 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513174 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513195 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513213 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513240 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513319 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.513354 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgqkz\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz\") pod \"db8ea573-a0c2-4c11-9f5b-4122524f6384\" (UID: \"db8ea573-a0c2-4c11-9f5b-4122524f6384\") " Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.520556 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.522183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.522250 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph" (OuterVolumeSpecName: "ceph") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.522167 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.523177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.523497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.524583 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.541465 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz" (OuterVolumeSpecName: "kube-api-access-lgqkz") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "kube-api-access-lgqkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.541841 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.541905 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.542353 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.543617 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.544123 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.546581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.547866 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.560919 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory" (OuterVolumeSpecName: "inventory") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.585287 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db8ea573-a0c2-4c11-9f5b-4122524f6384" (UID: "db8ea573-a0c2-4c11-9f5b-4122524f6384"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616199 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616229 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616241 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgqkz\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-kube-api-access-lgqkz\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616255 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616269 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616282 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616293 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616304 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616316 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616324 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616334 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616342 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/db8ea573-a0c2-4c11-9f5b-4122524f6384-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616350 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616361 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616373 4909 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616381 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.616389 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db8ea573-a0c2-4c11-9f5b-4122524f6384-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.770991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" event={"ID":"db8ea573-a0c2-4c11-9f5b-4122524f6384","Type":"ContainerDied","Data":"fc035f49451ca8a60c23b57eaf3ca2f870efb850ac719a4afee556448c8f989a"} Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.771058 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc035f49451ca8a60c23b57eaf3ca2f870efb850ac719a4afee556448c8f989a" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.771102 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.932870 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76"] Oct 02 19:19:24 crc kubenswrapper[4909]: E1002 19:19:24.933703 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8ea573-a0c2-4c11-9f5b-4122524f6384" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.933803 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8ea573-a0c2-4c11-9f5b-4122524f6384" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.934142 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8ea573-a0c2-4c11-9f5b-4122524f6384" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.936632 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.939218 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.939841 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.939965 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.940309 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.942913 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:19:24 crc kubenswrapper[4909]: I1002 19:19:24.946560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76"] Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.125640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.125692 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.126096 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.126166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgnq\" (UniqueName: \"kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.228001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.228073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgnq\" (UniqueName: \"kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.228199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.228230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.232503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.232515 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.232899 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 
19:19:25.246963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgnq\" (UniqueName: \"kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.275761 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.915978 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:19:25 crc kubenswrapper[4909]: I1002 19:19:25.916286 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76"] Oct 02 19:19:26 crc kubenswrapper[4909]: I1002 19:19:26.791452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" event={"ID":"9dae9f65-075e-408c-851a-61e7b36a99f7","Type":"ContainerStarted","Data":"f691809fc5b1fa2903d60ebfd7919fe5877ffa833359ccf8a3e24edcc23a1be9"} Oct 02 19:19:27 crc kubenswrapper[4909]: I1002 19:19:27.803955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" event={"ID":"9dae9f65-075e-408c-851a-61e7b36a99f7","Type":"ContainerStarted","Data":"36628eaafddb947d910861dce352eceffddeac8d083dcb85778726b2b670dc31"} Oct 02 19:19:27 crc kubenswrapper[4909]: I1002 19:19:27.826883 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" podStartSLOduration=3.303183932 podStartE2EDuration="3.82686269s" podCreationTimestamp="2025-10-02 19:19:24 +0000 UTC" firstStartedPulling="2025-10-02 
19:19:25.915391124 +0000 UTC m=+3687.102887033" lastFinishedPulling="2025-10-02 19:19:26.439069892 +0000 UTC m=+3687.626565791" observedRunningTime="2025-10-02 19:19:27.82235583 +0000 UTC m=+3689.009851739" watchObservedRunningTime="2025-10-02 19:19:27.82686269 +0000 UTC m=+3689.014358549" Oct 02 19:19:34 crc kubenswrapper[4909]: I1002 19:19:34.875046 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dae9f65-075e-408c-851a-61e7b36a99f7" containerID="36628eaafddb947d910861dce352eceffddeac8d083dcb85778726b2b670dc31" exitCode=0 Oct 02 19:19:34 crc kubenswrapper[4909]: I1002 19:19:34.875132 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" event={"ID":"9dae9f65-075e-408c-851a-61e7b36a99f7","Type":"ContainerDied","Data":"36628eaafddb947d910861dce352eceffddeac8d083dcb85778726b2b670dc31"} Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.451803 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.576635 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory\") pod \"9dae9f65-075e-408c-851a-61e7b36a99f7\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.576830 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key\") pod \"9dae9f65-075e-408c-851a-61e7b36a99f7\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.576919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph\") pod \"9dae9f65-075e-408c-851a-61e7b36a99f7\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.576992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbgnq\" (UniqueName: \"kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq\") pod \"9dae9f65-075e-408c-851a-61e7b36a99f7\" (UID: \"9dae9f65-075e-408c-851a-61e7b36a99f7\") " Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.585619 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph" (OuterVolumeSpecName: "ceph") pod "9dae9f65-075e-408c-851a-61e7b36a99f7" (UID: "9dae9f65-075e-408c-851a-61e7b36a99f7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.597537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq" (OuterVolumeSpecName: "kube-api-access-xbgnq") pod "9dae9f65-075e-408c-851a-61e7b36a99f7" (UID: "9dae9f65-075e-408c-851a-61e7b36a99f7"). InnerVolumeSpecName "kube-api-access-xbgnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.612155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory" (OuterVolumeSpecName: "inventory") pod "9dae9f65-075e-408c-851a-61e7b36a99f7" (UID: "9dae9f65-075e-408c-851a-61e7b36a99f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.619685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9dae9f65-075e-408c-851a-61e7b36a99f7" (UID: "9dae9f65-075e-408c-851a-61e7b36a99f7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.679881 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.680609 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbgnq\" (UniqueName: \"kubernetes.io/projected/9dae9f65-075e-408c-851a-61e7b36a99f7-kube-api-access-xbgnq\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.680685 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.680741 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dae9f65-075e-408c-851a-61e7b36a99f7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.902179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" event={"ID":"9dae9f65-075e-408c-851a-61e7b36a99f7","Type":"ContainerDied","Data":"f691809fc5b1fa2903d60ebfd7919fe5877ffa833359ccf8a3e24edcc23a1be9"} Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.902415 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f691809fc5b1fa2903d60ebfd7919fe5877ffa833359ccf8a3e24edcc23a1be9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.902237 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.970837 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9"] Oct 02 19:19:36 crc kubenswrapper[4909]: E1002 19:19:36.971253 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dae9f65-075e-408c-851a-61e7b36a99f7" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.971275 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dae9f65-075e-408c-851a-61e7b36a99f7" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.971475 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dae9f65-075e-408c-851a-61e7b36a99f7" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.972144 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.975590 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.975714 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.975746 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.975768 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.976161 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.976396 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989358 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989390 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69sf\" (UniqueName: \"kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989544 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:36 crc kubenswrapper[4909]: I1002 19:19:36.989568 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.029006 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9"] Oct 02 19:19:37 crc 
kubenswrapper[4909]: I1002 19:19:37.091838 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69sf\" (UniqueName: \"kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.091987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.092015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.092059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.092080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: 
\"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.092126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.093589 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.096364 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.097201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.103198 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.103421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.117065 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69sf\" (UniqueName: \"kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s99c9\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.331669 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:19:37 crc kubenswrapper[4909]: I1002 19:19:37.941688 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9"] Oct 02 19:19:37 crc kubenswrapper[4909]: W1002 19:19:37.947730 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aaf1b55_e573_4c4a_b68a_7d9477b5393d.slice/crio-d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77 WatchSource:0}: Error finding container d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77: Status 404 returned error can't find the container with id d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77 Oct 02 19:19:38 crc kubenswrapper[4909]: I1002 19:19:38.927260 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" event={"ID":"5aaf1b55-e573-4c4a-b68a-7d9477b5393d","Type":"ContainerStarted","Data":"d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77"} Oct 02 19:19:39 crc kubenswrapper[4909]: I1002 19:19:39.945578 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" event={"ID":"5aaf1b55-e573-4c4a-b68a-7d9477b5393d","Type":"ContainerStarted","Data":"9bc0e1b02c7f39b58964d0717a738b64ef5ec392cc14f1984d3178a8f34db9c1"} Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.155235 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" podStartSLOduration=4.403675532 podStartE2EDuration="5.155191575s" podCreationTimestamp="2025-10-02 19:19:36 +0000 UTC" firstStartedPulling="2025-10-02 19:19:37.950116668 +0000 UTC m=+3699.137612537" lastFinishedPulling="2025-10-02 19:19:38.701632711 +0000 UTC m=+3699.889128580" observedRunningTime="2025-10-02 
19:19:39.974481443 +0000 UTC m=+3701.161977332" watchObservedRunningTime="2025-10-02 19:19:41.155191575 +0000 UTC m=+3702.342687454" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.176277 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.178426 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.189746 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.305539 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.305679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.305711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dc4n\" (UniqueName: \"kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.408550 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.408604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.408643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dc4n\" (UniqueName: \"kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.409157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.409568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.434946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dc4n\" 
(UniqueName: \"kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n\") pod \"redhat-operators-fjstg\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:41 crc kubenswrapper[4909]: I1002 19:19:41.511641 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:42 crc kubenswrapper[4909]: I1002 19:19:42.008567 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:19:42 crc kubenswrapper[4909]: I1002 19:19:42.985507 4909 generic.go:334] "Generic (PLEG): container finished" podID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerID="75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea" exitCode=0 Oct 02 19:19:42 crc kubenswrapper[4909]: I1002 19:19:42.985595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerDied","Data":"75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea"} Oct 02 19:19:42 crc kubenswrapper[4909]: I1002 19:19:42.987819 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerStarted","Data":"ee41a3319c9bb2c91d4b1e3a9b469a06bbe04b40dad01cffabe9e741dc67b831"} Oct 02 19:19:45 crc kubenswrapper[4909]: I1002 19:19:45.013729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerStarted","Data":"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd"} Oct 02 19:19:46 crc kubenswrapper[4909]: I1002 19:19:46.030737 4909 generic.go:334] "Generic (PLEG): container finished" podID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" 
containerID="5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd" exitCode=0 Oct 02 19:19:46 crc kubenswrapper[4909]: I1002 19:19:46.031050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerDied","Data":"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd"} Oct 02 19:19:47 crc kubenswrapper[4909]: I1002 19:19:47.049187 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerStarted","Data":"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e"} Oct 02 19:19:47 crc kubenswrapper[4909]: I1002 19:19:47.073233 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fjstg" podStartSLOduration=2.5629664439999997 podStartE2EDuration="6.073211454s" podCreationTimestamp="2025-10-02 19:19:41 +0000 UTC" firstStartedPulling="2025-10-02 19:19:42.989449293 +0000 UTC m=+3704.176945192" lastFinishedPulling="2025-10-02 19:19:46.499694323 +0000 UTC m=+3707.687190202" observedRunningTime="2025-10-02 19:19:47.069931833 +0000 UTC m=+3708.257427702" watchObservedRunningTime="2025-10-02 19:19:47.073211454 +0000 UTC m=+3708.260707323" Oct 02 19:19:51 crc kubenswrapper[4909]: I1002 19:19:51.512319 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:51 crc kubenswrapper[4909]: I1002 19:19:51.514003 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:19:52 crc kubenswrapper[4909]: I1002 19:19:52.629474 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fjstg" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="registry-server" 
probeResult="failure" output=< Oct 02 19:19:52 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:19:52 crc kubenswrapper[4909]: > Oct 02 19:19:53 crc kubenswrapper[4909]: I1002 19:19:53.054810 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:19:53 crc kubenswrapper[4909]: I1002 19:19:53.055424 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:19:53 crc kubenswrapper[4909]: I1002 19:19:53.055634 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:19:53 crc kubenswrapper[4909]: I1002 19:19:53.056888 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:19:53 crc kubenswrapper[4909]: I1002 19:19:53.057234 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" gracePeriod=600 Oct 02 19:19:53 crc kubenswrapper[4909]: E1002 
19:19:53.194975 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:19:54 crc kubenswrapper[4909]: I1002 19:19:54.180709 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" exitCode=0 Oct 02 19:19:54 crc kubenswrapper[4909]: I1002 19:19:54.180757 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756"} Oct 02 19:19:54 crc kubenswrapper[4909]: I1002 19:19:54.180814 4909 scope.go:117] "RemoveContainer" containerID="b79c137b5e9368ee0f969f855be8e9f5d409f78db2ec6b15925b5fbc9ffd0f7d" Oct 02 19:19:54 crc kubenswrapper[4909]: I1002 19:19:54.182161 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:19:54 crc kubenswrapper[4909]: E1002 19:19:54.182985 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:20:01 crc kubenswrapper[4909]: I1002 19:20:01.578595 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:20:01 crc kubenswrapper[4909]: I1002 19:20:01.674394 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:20:01 crc kubenswrapper[4909]: I1002 19:20:01.825384 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:20:03 crc kubenswrapper[4909]: I1002 19:20:03.308013 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fjstg" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="registry-server" containerID="cri-o://fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e" gracePeriod=2 Oct 02 19:20:03 crc kubenswrapper[4909]: I1002 19:20:03.841268 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.034265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content\") pod \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.034501 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities\") pod \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.034645 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dc4n\" (UniqueName: \"kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n\") pod 
\"501bccc8-caf8-4c0b-b9ec-381d018bddc9\" (UID: \"501bccc8-caf8-4c0b-b9ec-381d018bddc9\") " Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.035596 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities" (OuterVolumeSpecName: "utilities") pod "501bccc8-caf8-4c0b-b9ec-381d018bddc9" (UID: "501bccc8-caf8-4c0b-b9ec-381d018bddc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.036018 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.042189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n" (OuterVolumeSpecName: "kube-api-access-4dc4n") pod "501bccc8-caf8-4c0b-b9ec-381d018bddc9" (UID: "501bccc8-caf8-4c0b-b9ec-381d018bddc9"). InnerVolumeSpecName "kube-api-access-4dc4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.138519 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dc4n\" (UniqueName: \"kubernetes.io/projected/501bccc8-caf8-4c0b-b9ec-381d018bddc9-kube-api-access-4dc4n\") on node \"crc\" DevicePath \"\"" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.143884 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "501bccc8-caf8-4c0b-b9ec-381d018bddc9" (UID: "501bccc8-caf8-4c0b-b9ec-381d018bddc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.241925 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501bccc8-caf8-4c0b-b9ec-381d018bddc9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.320934 4909 generic.go:334] "Generic (PLEG): container finished" podID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerID="fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e" exitCode=0 Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.320986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerDied","Data":"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e"} Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.321006 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fjstg" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.321049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjstg" event={"ID":"501bccc8-caf8-4c0b-b9ec-381d018bddc9","Type":"ContainerDied","Data":"ee41a3319c9bb2c91d4b1e3a9b469a06bbe04b40dad01cffabe9e741dc67b831"} Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.321071 4909 scope.go:117] "RemoveContainer" containerID="fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.353617 4909 scope.go:117] "RemoveContainer" containerID="5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.380888 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.399528 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fjstg"] Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.403009 4909 scope.go:117] "RemoveContainer" containerID="75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.484731 4909 scope.go:117] "RemoveContainer" containerID="fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e" Oct 02 19:20:04 crc kubenswrapper[4909]: E1002 19:20:04.486073 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e\": container with ID starting with fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e not found: ID does not exist" containerID="fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.486133 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e"} err="failed to get container status \"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e\": rpc error: code = NotFound desc = could not find container \"fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e\": container with ID starting with fb4ca08596e88f0ff507c65b86d220cfeb02686def9e7e4ee2ac4f25327f215e not found: ID does not exist" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.486172 4909 scope.go:117] "RemoveContainer" containerID="5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd" Oct 02 19:20:04 crc kubenswrapper[4909]: E1002 19:20:04.486729 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd\": container with ID starting with 5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd not found: ID does not exist" containerID="5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.486762 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd"} err="failed to get container status \"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd\": rpc error: code = NotFound desc = could not find container \"5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd\": container with ID starting with 5b86bdc39d119ef99d9c8fc614063b1676681e3297fb334c070346baa1c79cfd not found: ID does not exist" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.486784 4909 scope.go:117] "RemoveContainer" containerID="75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea" Oct 02 19:20:04 crc kubenswrapper[4909]: E1002 
19:20:04.487250 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea\": container with ID starting with 75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea not found: ID does not exist" containerID="75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea" Oct 02 19:20:04 crc kubenswrapper[4909]: I1002 19:20:04.487273 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea"} err="failed to get container status \"75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea\": rpc error: code = NotFound desc = could not find container \"75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea\": container with ID starting with 75ce56ced93cddfe55a4e67af165d98eb8cf460a5b3d237949df211b9c9935ea not found: ID does not exist" Oct 02 19:20:05 crc kubenswrapper[4909]: I1002 19:20:05.623997 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" path="/var/lib/kubelet/pods/501bccc8-caf8-4c0b-b9ec-381d018bddc9/volumes" Oct 02 19:20:07 crc kubenswrapper[4909]: I1002 19:20:07.609045 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:20:07 crc kubenswrapper[4909]: E1002 19:20:07.609595 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:20:22 crc kubenswrapper[4909]: I1002 19:20:22.609774 
4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:20:22 crc kubenswrapper[4909]: E1002 19:20:22.613476 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:20:35 crc kubenswrapper[4909]: I1002 19:20:35.609126 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:20:35 crc kubenswrapper[4909]: E1002 19:20:35.610055 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:20:49 crc kubenswrapper[4909]: I1002 19:20:49.621522 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:20:49 crc kubenswrapper[4909]: E1002 19:20:49.622734 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:21:02 crc kubenswrapper[4909]: I1002 
19:21:02.609185 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:21:02 crc kubenswrapper[4909]: E1002 19:21:02.609829 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:21:06 crc kubenswrapper[4909]: I1002 19:21:06.040282 4909 generic.go:334] "Generic (PLEG): container finished" podID="5aaf1b55-e573-4c4a-b68a-7d9477b5393d" containerID="9bc0e1b02c7f39b58964d0717a738b64ef5ec392cc14f1984d3178a8f34db9c1" exitCode=0 Oct 02 19:21:06 crc kubenswrapper[4909]: I1002 19:21:06.040926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" event={"ID":"5aaf1b55-e573-4c4a-b68a-7d9477b5393d","Type":"ContainerDied","Data":"9bc0e1b02c7f39b58964d0717a738b64ef5ec392cc14f1984d3178a8f34db9c1"} Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.580392 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729136 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729577 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69sf\" (UniqueName: \"kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729651 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729740 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729782 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.729852 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key\") pod \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\" (UID: \"5aaf1b55-e573-4c4a-b68a-7d9477b5393d\") " Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.735111 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph" (OuterVolumeSpecName: "ceph") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.736051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf" (OuterVolumeSpecName: "kube-api-access-q69sf") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "kube-api-access-q69sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.736719 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.768085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory" (OuterVolumeSpecName: "inventory") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.778600 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.781448 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5aaf1b55-e573-4c4a-b68a-7d9477b5393d" (UID: "5aaf1b55-e573-4c4a-b68a-7d9477b5393d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832899 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832934 4909 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832947 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832960 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ssh-key\") on node \"crc\" DevicePath 
\"\"" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832973 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:21:07 crc kubenswrapper[4909]: I1002 19:21:07.832983 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69sf\" (UniqueName: \"kubernetes.io/projected/5aaf1b55-e573-4c4a-b68a-7d9477b5393d-kube-api-access-q69sf\") on node \"crc\" DevicePath \"\"" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.069767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" event={"ID":"5aaf1b55-e573-4c4a-b68a-7d9477b5393d","Type":"ContainerDied","Data":"d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77"} Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.069818 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d99de6f82d06b0d79b609a67ef8bead42b1774ef0592f692e4023a73b7b2de77" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.069837 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s99c9" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.186689 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq"] Oct 02 19:21:08 crc kubenswrapper[4909]: E1002 19:21:08.187510 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaf1b55-e573-4c4a-b68a-7d9477b5393d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187539 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaf1b55-e573-4c4a-b68a-7d9477b5393d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 19:21:08 crc kubenswrapper[4909]: E1002 19:21:08.187565 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="extract-utilities" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187573 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="extract-utilities" Oct 02 19:21:08 crc kubenswrapper[4909]: E1002 19:21:08.187586 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="extract-content" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187594 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="extract-content" Oct 02 19:21:08 crc kubenswrapper[4909]: E1002 19:21:08.187615 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="registry-server" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187623 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="registry-server" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187886 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5aaf1b55-e573-4c4a-b68a-7d9477b5393d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.187951 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="501bccc8-caf8-4c0b-b9ec-381d018bddc9" containerName="registry-server" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.188827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.193976 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.193976 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.194113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.197809 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.198168 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.198212 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.199695 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.206774 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq"] Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344362 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jp6\" (UniqueName: \"kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.344909 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.345107 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447125 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc 
kubenswrapper[4909]: I1002 19:21:08.447221 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447270 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447358 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jp6\" (UniqueName: \"kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447430 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.447573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.452687 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.452834 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.453821 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.453973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.458201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.458278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.468294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jp6\" (UniqueName: \"kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:08 crc kubenswrapper[4909]: I1002 19:21:08.509593 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:21:09 crc kubenswrapper[4909]: I1002 19:21:09.072407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq"] Oct 02 19:21:10 crc kubenswrapper[4909]: I1002 19:21:10.098162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" event={"ID":"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7","Type":"ContainerStarted","Data":"35d4d57bb52eacae4e37e16c29978eabcaf0dbc92892ae777dd336448c7b0a91"} Oct 02 19:21:10 crc kubenswrapper[4909]: I1002 19:21:10.098519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" event={"ID":"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7","Type":"ContainerStarted","Data":"788a9cad26bcee6854ad27a0fbe8a15ba5ca55b4a9a260ebc4d3df86af93c9e0"} Oct 02 19:21:10 crc kubenswrapper[4909]: I1002 19:21:10.123441 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" podStartSLOduration=1.595994525 podStartE2EDuration="2.123419846s" podCreationTimestamp="2025-10-02 19:21:08 +0000 UTC" firstStartedPulling="2025-10-02 19:21:09.080113395 +0000 UTC m=+3790.267609254" lastFinishedPulling="2025-10-02 19:21:09.607538676 +0000 UTC m=+3790.795034575" observedRunningTime="2025-10-02 19:21:10.122345363 +0000 UTC m=+3791.309841232" watchObservedRunningTime="2025-10-02 19:21:10.123419846 +0000 UTC m=+3791.310915705" Oct 02 19:21:14 crc kubenswrapper[4909]: I1002 19:21:14.609863 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:21:14 crc kubenswrapper[4909]: E1002 19:21:14.611263 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:21:25 crc kubenswrapper[4909]: I1002 19:21:25.608545 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:21:25 crc kubenswrapper[4909]: E1002 19:21:25.609682 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:21:39 crc kubenswrapper[4909]: I1002 19:21:39.625421 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:21:39 crc kubenswrapper[4909]: E1002 19:21:39.626257 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:21:51 crc kubenswrapper[4909]: I1002 19:21:51.609360 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:21:51 crc kubenswrapper[4909]: E1002 19:21:51.610386 4909 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:22:03 crc kubenswrapper[4909]: I1002 19:22:03.609249 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:22:03 crc kubenswrapper[4909]: E1002 19:22:03.610695 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:22:18 crc kubenswrapper[4909]: I1002 19:22:18.608659 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:22:18 crc kubenswrapper[4909]: E1002 19:22:18.609514 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:22:30 crc kubenswrapper[4909]: I1002 19:22:30.608941 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:22:30 crc kubenswrapper[4909]: E1002 19:22:30.610325 4909 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:22:31 crc kubenswrapper[4909]: I1002 19:22:31.110186 4909 generic.go:334] "Generic (PLEG): container finished" podID="c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" containerID="35d4d57bb52eacae4e37e16c29978eabcaf0dbc92892ae777dd336448c7b0a91" exitCode=0 Oct 02 19:22:31 crc kubenswrapper[4909]: I1002 19:22:31.110231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" event={"ID":"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7","Type":"ContainerDied","Data":"35d4d57bb52eacae4e37e16c29978eabcaf0dbc92892ae777dd336448c7b0a91"} Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.565876 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.722691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.722824 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.722967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.723010 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.723098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.723126 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.723309 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jp6\" (UniqueName: \"kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6\") pod \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\" (UID: \"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7\") " Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.729614 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph" (OuterVolumeSpecName: "ceph") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.730983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.731690 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6" (OuterVolumeSpecName: "kube-api-access-r7jp6") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "kube-api-access-r7jp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.754992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.756936 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.770175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.770572 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory" (OuterVolumeSpecName: "inventory") pod "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" (UID: "c5ea6385-4b7d-43d2-9666-f87c8aeab5e7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.826871 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jp6\" (UniqueName: \"kubernetes.io/projected/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-kube-api-access-r7jp6\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827455 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827493 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827514 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827529 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827545 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:22:32 crc kubenswrapper[4909]: I1002 19:22:32.827563 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea6385-4b7d-43d2-9666-f87c8aeab5e7-neutron-metadata-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.134243 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" event={"ID":"c5ea6385-4b7d-43d2-9666-f87c8aeab5e7","Type":"ContainerDied","Data":"788a9cad26bcee6854ad27a0fbe8a15ba5ca55b4a9a260ebc4d3df86af93c9e0"} Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.134301 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="788a9cad26bcee6854ad27a0fbe8a15ba5ca55b4a9a260ebc4d3df86af93c9e0" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.134397 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.264280 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz"] Oct 02 19:22:33 crc kubenswrapper[4909]: E1002 19:22:33.265048 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.265067 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.265275 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ea6385-4b7d-43d2-9666-f87c8aeab5e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.265969 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.272467 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.272734 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.272914 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.273062 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.273547 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.273705 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.294358 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz"] Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.439590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.439758 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.439861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xnz\" (UniqueName: \"kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.439931 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.440013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.440061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7xnz\" (UniqueName: \"kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541476 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.541610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.548388 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.548715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.548902 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.549571 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.550196 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.565534 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7xnz\" (UniqueName: \"kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:33 crc kubenswrapper[4909]: I1002 19:22:33.590869 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:22:34 crc kubenswrapper[4909]: I1002 19:22:34.180693 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz"] Oct 02 19:22:35 crc kubenswrapper[4909]: I1002 19:22:35.158559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" event={"ID":"e917646f-ee0c-442e-8a71-d637ef36a45e","Type":"ContainerStarted","Data":"83f267429cc7ca065d992541443cedcdc63349052402aa7b14f2cbb165558826"} Oct 02 19:22:35 crc kubenswrapper[4909]: I1002 19:22:35.159179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" event={"ID":"e917646f-ee0c-442e-8a71-d637ef36a45e","Type":"ContainerStarted","Data":"d3f7ec1649ce312ad9343d27e351004de634b0bf2d73cfee196b9314441ba97b"} Oct 02 19:22:35 crc kubenswrapper[4909]: I1002 19:22:35.183894 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" podStartSLOduration=1.592546262 podStartE2EDuration="2.183866047s" podCreationTimestamp="2025-10-02 19:22:33 +0000 UTC" firstStartedPulling="2025-10-02 19:22:34.189755505 +0000 UTC m=+3875.377251374" lastFinishedPulling="2025-10-02 19:22:34.7810753 +0000 UTC m=+3875.968571159" observedRunningTime="2025-10-02 19:22:35.177557041 +0000 UTC m=+3876.365052930" watchObservedRunningTime="2025-10-02 19:22:35.183866047 +0000 UTC m=+3876.371361916" Oct 02 19:22:44 crc kubenswrapper[4909]: I1002 19:22:44.610390 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:22:44 crc kubenswrapper[4909]: E1002 19:22:44.611592 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:22:55 crc kubenswrapper[4909]: I1002 19:22:55.609414 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:22:55 crc kubenswrapper[4909]: E1002 19:22:55.610356 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:23:06 crc kubenswrapper[4909]: I1002 19:23:06.609223 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:23:06 crc kubenswrapper[4909]: E1002 19:23:06.610278 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:23:18 crc kubenswrapper[4909]: I1002 19:23:18.608645 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:23:18 crc kubenswrapper[4909]: E1002 19:23:18.609688 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:23:31 crc kubenswrapper[4909]: I1002 19:23:31.611551 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:23:31 crc kubenswrapper[4909]: E1002 19:23:31.612460 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:23:46 crc kubenswrapper[4909]: I1002 19:23:46.608367 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:23:46 crc kubenswrapper[4909]: E1002 19:23:46.609104 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:24:00 crc kubenswrapper[4909]: I1002 19:24:00.609105 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:24:00 crc kubenswrapper[4909]: E1002 19:24:00.610207 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:24:13 crc kubenswrapper[4909]: I1002 19:24:13.608920 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:24:13 crc kubenswrapper[4909]: E1002 19:24:13.610222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:24:28 crc kubenswrapper[4909]: I1002 19:24:28.609123 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:24:28 crc kubenswrapper[4909]: E1002 19:24:28.610308 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:24:42 crc kubenswrapper[4909]: I1002 19:24:42.611786 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:24:42 crc kubenswrapper[4909]: E1002 19:24:42.613424 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:24:55 crc kubenswrapper[4909]: I1002 19:24:55.608297 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:24:56 crc kubenswrapper[4909]: I1002 19:24:56.863580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b"} Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.570353 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.573240 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.581192 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.718606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.718697 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.718980 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wmg\" (UniqueName: \"kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.821284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.821577 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.821645 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wmg\" (UniqueName: \"kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.821888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.822137 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.841933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wmg\" (UniqueName: \"kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg\") pod \"certified-operators-7pvkl\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:22 crc kubenswrapper[4909]: I1002 19:25:22.897572 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:23 crc kubenswrapper[4909]: I1002 19:25:23.435536 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.217533 4909 generic.go:334] "Generic (PLEG): container finished" podID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerID="25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3" exitCode=0 Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.217641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerDied","Data":"25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3"} Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.217814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerStarted","Data":"8d94d0c86a521b85b1015e81484d50a533b8884cd3d6260bf538ad2a9579e717"} Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.219630 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.953920 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.958151 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:24 crc kubenswrapper[4909]: I1002 19:25:24.980956 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.071976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.072090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6rj\" (UniqueName: \"kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.072124 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.149977 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6hbk5"] Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.152481 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.163299 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hbk5"] Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.173565 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.173639 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6rj\" (UniqueName: \"kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.173665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.174194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.174405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.225928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6rj\" (UniqueName: \"kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj\") pod \"redhat-marketplace-9b5t5\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.276192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-utilities\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.276447 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkvw\" (UniqueName: \"kubernetes.io/projected/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-kube-api-access-vxkvw\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.277156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-catalog-content\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.288884 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.379548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-utilities\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.379883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkvw\" (UniqueName: \"kubernetes.io/projected/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-kube-api-access-vxkvw\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.380053 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-catalog-content\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.380486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-utilities\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.380548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-catalog-content\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " 
pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.413910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkvw\" (UniqueName: \"kubernetes.io/projected/ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd-kube-api-access-vxkvw\") pod \"community-operators-6hbk5\" (UID: \"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd\") " pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.473495 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:25 crc kubenswrapper[4909]: I1002 19:25:25.939734 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:25 crc kubenswrapper[4909]: W1002 19:25:25.945858 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod200a73a4_7b3d_4b70_aff7_1e8511a3a81a.slice/crio-9ba4f4e6496c3e56d80f178e5baa0e59cd20073341ac22e4eb5ffa0e02c428ac WatchSource:0}: Error finding container 9ba4f4e6496c3e56d80f178e5baa0e59cd20073341ac22e4eb5ffa0e02c428ac: Status 404 returned error can't find the container with id 9ba4f4e6496c3e56d80f178e5baa0e59cd20073341ac22e4eb5ffa0e02c428ac Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.108042 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hbk5"] Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.236806 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbk5" event={"ID":"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd","Type":"ContainerStarted","Data":"2028b9e73f5f2edc54d31b080ef797f9009c4242b611c83803e3a4c786cd9f17"} Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.239469 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerID="5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59" exitCode=0 Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.239530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerDied","Data":"5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59"} Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.239737 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerStarted","Data":"9ba4f4e6496c3e56d80f178e5baa0e59cd20073341ac22e4eb5ffa0e02c428ac"} Oct 02 19:25:26 crc kubenswrapper[4909]: I1002 19:25:26.246334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerStarted","Data":"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35"} Oct 02 19:25:27 crc kubenswrapper[4909]: I1002 19:25:27.258483 4909 generic.go:334] "Generic (PLEG): container finished" podID="ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd" containerID="d3fa5696e5d1610f2652c1bfee1de9742b8c1f7ccdc024058b49f09adcc2bc73" exitCode=0 Oct 02 19:25:27 crc kubenswrapper[4909]: I1002 19:25:27.258528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbk5" event={"ID":"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd","Type":"ContainerDied","Data":"d3fa5696e5d1610f2652c1bfee1de9742b8c1f7ccdc024058b49f09adcc2bc73"} Oct 02 19:25:27 crc kubenswrapper[4909]: I1002 19:25:27.261324 4909 generic.go:334] "Generic (PLEG): container finished" podID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerID="91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35" exitCode=0 Oct 02 19:25:27 crc kubenswrapper[4909]: I1002 19:25:27.261362 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerDied","Data":"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35"} Oct 02 19:25:28 crc kubenswrapper[4909]: I1002 19:25:28.280879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerStarted","Data":"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28"} Oct 02 19:25:29 crc kubenswrapper[4909]: I1002 19:25:29.294443 4909 generic.go:334] "Generic (PLEG): container finished" podID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerID="532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28" exitCode=0 Oct 02 19:25:29 crc kubenswrapper[4909]: I1002 19:25:29.294523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerDied","Data":"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28"} Oct 02 19:25:29 crc kubenswrapper[4909]: I1002 19:25:29.299158 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerStarted","Data":"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0"} Oct 02 19:25:29 crc kubenswrapper[4909]: I1002 19:25:29.353803 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7pvkl" podStartSLOduration=3.649598372 podStartE2EDuration="7.35378453s" podCreationTimestamp="2025-10-02 19:25:22 +0000 UTC" firstStartedPulling="2025-10-02 19:25:24.219383178 +0000 UTC m=+4045.406879037" lastFinishedPulling="2025-10-02 19:25:27.923569326 +0000 UTC m=+4049.111065195" observedRunningTime="2025-10-02 
19:25:29.342338164 +0000 UTC m=+4050.529834033" watchObservedRunningTime="2025-10-02 19:25:29.35378453 +0000 UTC m=+4050.541280389" Oct 02 19:25:32 crc kubenswrapper[4909]: E1002 19:25:32.828452 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab21d0da_ea4e_46e7_a9b7_cd0f81b3dddd.slice/crio-4c075f3a8d32d248a4f4ba6508a08744449c0d6491e638416197600fff390391.scope\": RecentStats: unable to find data in memory cache]" Oct 02 19:25:32 crc kubenswrapper[4909]: I1002 19:25:32.897797 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:32 crc kubenswrapper[4909]: I1002 19:25:32.898526 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:33 crc kubenswrapper[4909]: I1002 19:25:33.348964 4909 generic.go:334] "Generic (PLEG): container finished" podID="ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd" containerID="4c075f3a8d32d248a4f4ba6508a08744449c0d6491e638416197600fff390391" exitCode=0 Oct 02 19:25:33 crc kubenswrapper[4909]: I1002 19:25:33.349120 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbk5" event={"ID":"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd","Type":"ContainerDied","Data":"4c075f3a8d32d248a4f4ba6508a08744449c0d6491e638416197600fff390391"} Oct 02 19:25:33 crc kubenswrapper[4909]: I1002 19:25:33.354754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerStarted","Data":"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e"} Oct 02 19:25:33 crc kubenswrapper[4909]: I1002 19:25:33.403853 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-9b5t5" podStartSLOduration=3.498659151 podStartE2EDuration="9.403827763s" podCreationTimestamp="2025-10-02 19:25:24 +0000 UTC" firstStartedPulling="2025-10-02 19:25:26.24165309 +0000 UTC m=+4047.429148959" lastFinishedPulling="2025-10-02 19:25:32.146821712 +0000 UTC m=+4053.334317571" observedRunningTime="2025-10-02 19:25:33.400157699 +0000 UTC m=+4054.587653558" watchObservedRunningTime="2025-10-02 19:25:33.403827763 +0000 UTC m=+4054.591323652" Oct 02 19:25:33 crc kubenswrapper[4909]: I1002 19:25:33.955304 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7pvkl" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="registry-server" probeResult="failure" output=< Oct 02 19:25:33 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:25:33 crc kubenswrapper[4909]: > Oct 02 19:25:34 crc kubenswrapper[4909]: I1002 19:25:34.368829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbk5" event={"ID":"ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd","Type":"ContainerStarted","Data":"add6cadc181942c18e19c0253268d7fef514ae0d6a1f0cd97185396c00b7486d"} Oct 02 19:25:34 crc kubenswrapper[4909]: I1002 19:25:34.393695 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6hbk5" podStartSLOduration=2.85935253 podStartE2EDuration="9.393679533s" podCreationTimestamp="2025-10-02 19:25:25 +0000 UTC" firstStartedPulling="2025-10-02 19:25:27.261044747 +0000 UTC m=+4048.448540626" lastFinishedPulling="2025-10-02 19:25:33.79537176 +0000 UTC m=+4054.982867629" observedRunningTime="2025-10-02 19:25:34.392505007 +0000 UTC m=+4055.580000866" watchObservedRunningTime="2025-10-02 19:25:34.393679533 +0000 UTC m=+4055.581175392" Oct 02 19:25:35 crc kubenswrapper[4909]: I1002 19:25:35.289999 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:35 crc kubenswrapper[4909]: I1002 19:25:35.290062 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:35 crc kubenswrapper[4909]: I1002 19:25:35.341922 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:35 crc kubenswrapper[4909]: I1002 19:25:35.473907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:35 crc kubenswrapper[4909]: I1002 19:25:35.474063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:36 crc kubenswrapper[4909]: I1002 19:25:36.530153 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6hbk5" podUID="ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd" containerName="registry-server" probeResult="failure" output=< Oct 02 19:25:36 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:25:36 crc kubenswrapper[4909]: > Oct 02 19:25:43 crc kubenswrapper[4909]: I1002 19:25:43.014533 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:43 crc kubenswrapper[4909]: I1002 19:25:43.173350 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:43 crc kubenswrapper[4909]: I1002 19:25:43.262726 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:44 crc kubenswrapper[4909]: I1002 19:25:44.500493 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7pvkl" 
podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="registry-server" containerID="cri-o://eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0" gracePeriod=2 Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.100737 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.238729 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content\") pod \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.238817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities\") pod \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.238960 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7wmg\" (UniqueName: \"kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg\") pod \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\" (UID: \"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71\") " Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.240443 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities" (OuterVolumeSpecName: "utilities") pod "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" (UID: "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.247904 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg" (OuterVolumeSpecName: "kube-api-access-g7wmg") pod "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" (UID: "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71"). InnerVolumeSpecName "kube-api-access-g7wmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.308403 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" (UID: "0f389326-bf5f-4f35-9a9d-fcc41b3c5c71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.341908 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.341949 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.341964 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7wmg\" (UniqueName: \"kubernetes.io/projected/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71-kube-api-access-g7wmg\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.362849 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:45 crc 
kubenswrapper[4909]: I1002 19:25:45.513905 4909 generic.go:334] "Generic (PLEG): container finished" podID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerID="eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0" exitCode=0 Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.513955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerDied","Data":"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0"} Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.513987 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pvkl" event={"ID":"0f389326-bf5f-4f35-9a9d-fcc41b3c5c71","Type":"ContainerDied","Data":"8d94d0c86a521b85b1015e81484d50a533b8884cd3d6260bf538ad2a9579e717"} Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.513983 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pvkl" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.514001 4909 scope.go:117] "RemoveContainer" containerID="eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.535513 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.565770 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.575761 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7pvkl"] Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.584951 4909 scope.go:117] "RemoveContainer" containerID="91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35" Oct 02 19:25:45 crc 
kubenswrapper[4909]: I1002 19:25:45.602663 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6hbk5" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.623101 4909 scope.go:117] "RemoveContainer" containerID="25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.633696 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" path="/var/lib/kubelet/pods/0f389326-bf5f-4f35-9a9d-fcc41b3c5c71/volumes" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.693821 4909 scope.go:117] "RemoveContainer" containerID="eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0" Oct 02 19:25:45 crc kubenswrapper[4909]: E1002 19:25:45.694485 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0\": container with ID starting with eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0 not found: ID does not exist" containerID="eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.694581 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0"} err="failed to get container status \"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0\": rpc error: code = NotFound desc = could not find container \"eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0\": container with ID starting with eb43d4caf07f419549f3176dbb3a921a2a219811bbb36591e5de32dba4bea5a0 not found: ID does not exist" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.694615 4909 scope.go:117] "RemoveContainer" 
containerID="91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35" Oct 02 19:25:45 crc kubenswrapper[4909]: E1002 19:25:45.695114 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35\": container with ID starting with 91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35 not found: ID does not exist" containerID="91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.695156 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35"} err="failed to get container status \"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35\": rpc error: code = NotFound desc = could not find container \"91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35\": container with ID starting with 91d1952c998e65f59691fafac3d6affaf23a1f7fa4c6bc3087e000e46fb0cb35 not found: ID does not exist" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.695184 4909 scope.go:117] "RemoveContainer" containerID="25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3" Oct 02 19:25:45 crc kubenswrapper[4909]: E1002 19:25:45.695472 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3\": container with ID starting with 25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3 not found: ID does not exist" containerID="25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3" Oct 02 19:25:45 crc kubenswrapper[4909]: I1002 19:25:45.695500 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3"} err="failed to get container status \"25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3\": rpc error: code = NotFound desc = could not find container \"25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3\": container with ID starting with 25e36a5cb32351950d3e28d95ae77e6d179fbed95b1a958059ec9cf5aa28abf3 not found: ID does not exist" Oct 02 19:25:47 crc kubenswrapper[4909]: I1002 19:25:47.668573 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:47 crc kubenswrapper[4909]: I1002 19:25:47.669174 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9b5t5" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="registry-server" containerID="cri-o://6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e" gracePeriod=2 Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.255690 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.419119 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities\") pod \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.419278 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6rj\" (UniqueName: \"kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj\") pod \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.419358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content\") pod \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\" (UID: \"200a73a4-7b3d-4b70-aff7-1e8511a3a81a\") " Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.420933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities" (OuterVolumeSpecName: "utilities") pod "200a73a4-7b3d-4b70-aff7-1e8511a3a81a" (UID: "200a73a4-7b3d-4b70-aff7-1e8511a3a81a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.428365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj" (OuterVolumeSpecName: "kube-api-access-2h6rj") pod "200a73a4-7b3d-4b70-aff7-1e8511a3a81a" (UID: "200a73a4-7b3d-4b70-aff7-1e8511a3a81a"). InnerVolumeSpecName "kube-api-access-2h6rj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.439139 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "200a73a4-7b3d-4b70-aff7-1e8511a3a81a" (UID: "200a73a4-7b3d-4b70-aff7-1e8511a3a81a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.521930 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.521966 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6rj\" (UniqueName: \"kubernetes.io/projected/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-kube-api-access-2h6rj\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.521980 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200a73a4-7b3d-4b70-aff7-1e8511a3a81a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.552273 4909 generic.go:334] "Generic (PLEG): container finished" podID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerID="6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e" exitCode=0 Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.552322 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerDied","Data":"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e"} Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.552352 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5t5" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.552364 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5t5" event={"ID":"200a73a4-7b3d-4b70-aff7-1e8511a3a81a","Type":"ContainerDied","Data":"9ba4f4e6496c3e56d80f178e5baa0e59cd20073341ac22e4eb5ffa0e02c428ac"} Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.552386 4909 scope.go:117] "RemoveContainer" containerID="6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.594216 4909 scope.go:117] "RemoveContainer" containerID="532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.613112 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.630100 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5t5"] Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.633597 4909 scope.go:117] "RemoveContainer" containerID="5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.699577 4909 scope.go:117] "RemoveContainer" containerID="6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e" Oct 02 19:25:48 crc kubenswrapper[4909]: E1002 19:25:48.700320 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e\": container with ID starting with 6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e not found: ID does not exist" containerID="6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.700378 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e"} err="failed to get container status \"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e\": rpc error: code = NotFound desc = could not find container \"6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e\": container with ID starting with 6de7526cdfb4d61eea4bc590cf9523d76efb66ae916020ef5d011a8c1313cd0e not found: ID does not exist" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.700407 4909 scope.go:117] "RemoveContainer" containerID="532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28" Oct 02 19:25:48 crc kubenswrapper[4909]: E1002 19:25:48.700848 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28\": container with ID starting with 532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28 not found: ID does not exist" containerID="532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.700881 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28"} err="failed to get container status \"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28\": rpc error: code = NotFound desc = could not find container \"532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28\": container with ID starting with 532165ed1139c9a93dc5c656a131f02ec8906a2a30daea3b6015d0cb48148c28 not found: ID does not exist" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.700896 4909 scope.go:117] "RemoveContainer" containerID="5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59" Oct 02 19:25:48 crc kubenswrapper[4909]: E1002 
19:25:48.701273 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59\": container with ID starting with 5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59 not found: ID does not exist" containerID="5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59" Oct 02 19:25:48 crc kubenswrapper[4909]: I1002 19:25:48.701312 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59"} err="failed to get container status \"5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59\": rpc error: code = NotFound desc = could not find container \"5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59\": container with ID starting with 5484b0801bff76ab6ae0dd7efcd460df0b0a60e183cb7ec285ad25830d4bbe59 not found: ID does not exist" Oct 02 19:25:49 crc kubenswrapper[4909]: I1002 19:25:49.513496 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hbk5"] Oct 02 19:25:49 crc kubenswrapper[4909]: I1002 19:25:49.622571 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" path="/var/lib/kubelet/pods/200a73a4-7b3d-4b70-aff7-1e8511a3a81a/volumes" Oct 02 19:25:49 crc kubenswrapper[4909]: I1002 19:25:49.911507 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 19:25:49 crc kubenswrapper[4909]: I1002 19:25:49.911790 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nz5sb" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="registry-server" containerID="cri-o://d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877" gracePeriod=2 Oct 02 19:25:50 crc 
kubenswrapper[4909]: I1002 19:25:50.433209 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.566848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k558\" (UniqueName: \"kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558\") pod \"27111323-170a-4dfd-9620-8062abd30b0f\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.566918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content\") pod \"27111323-170a-4dfd-9620-8062abd30b0f\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.567020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities\") pod \"27111323-170a-4dfd-9620-8062abd30b0f\" (UID: \"27111323-170a-4dfd-9620-8062abd30b0f\") " Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.570871 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities" (OuterVolumeSpecName: "utilities") pod "27111323-170a-4dfd-9620-8062abd30b0f" (UID: "27111323-170a-4dfd-9620-8062abd30b0f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.573955 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558" (OuterVolumeSpecName: "kube-api-access-7k558") pod "27111323-170a-4dfd-9620-8062abd30b0f" (UID: "27111323-170a-4dfd-9620-8062abd30b0f"). InnerVolumeSpecName "kube-api-access-7k558". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.577815 4909 generic.go:334] "Generic (PLEG): container finished" podID="27111323-170a-4dfd-9620-8062abd30b0f" containerID="d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877" exitCode=0 Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.577861 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerDied","Data":"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877"} Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.577887 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nz5sb" event={"ID":"27111323-170a-4dfd-9620-8062abd30b0f","Type":"ContainerDied","Data":"079d3f39699f2723310abdcb043fd36e42cd7797568cac3c8aba229b18b532e4"} Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.577889 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nz5sb" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.577903 4909 scope.go:117] "RemoveContainer" containerID="d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.644234 4909 scope.go:117] "RemoveContainer" containerID="0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.670197 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k558\" (UniqueName: \"kubernetes.io/projected/27111323-170a-4dfd-9620-8062abd30b0f-kube-api-access-7k558\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.670246 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.671762 4909 scope.go:117] "RemoveContainer" containerID="c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.680323 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27111323-170a-4dfd-9620-8062abd30b0f" (UID: "27111323-170a-4dfd-9620-8062abd30b0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.713377 4909 scope.go:117] "RemoveContainer" containerID="d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877" Oct 02 19:25:50 crc kubenswrapper[4909]: E1002 19:25:50.713786 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877\": container with ID starting with d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877 not found: ID does not exist" containerID="d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.713849 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877"} err="failed to get container status \"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877\": rpc error: code = NotFound desc = could not find container \"d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877\": container with ID starting with d95b4d600dd948a707b639ba0b0137e0b48cb60bd2f46bc9a85cecf359e61877 not found: ID does not exist" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.713894 4909 scope.go:117] "RemoveContainer" containerID="0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c" Oct 02 19:25:50 crc kubenswrapper[4909]: E1002 19:25:50.714528 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c\": container with ID starting with 0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c not found: ID does not exist" containerID="0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.714571 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c"} err="failed to get container status \"0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c\": rpc error: code = NotFound desc = could not find container \"0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c\": container with ID starting with 0f59a70dd101b8dc8bc9516153b8ba9388a84c3d5e9567e28af85b75c559e59c not found: ID does not exist" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.714597 4909 scope.go:117] "RemoveContainer" containerID="c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0" Oct 02 19:25:50 crc kubenswrapper[4909]: E1002 19:25:50.714897 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0\": container with ID starting with c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0 not found: ID does not exist" containerID="c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.714941 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0"} err="failed to get container status \"c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0\": rpc error: code = NotFound desc = could not find container \"c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0\": container with ID starting with c79915d0fd1d2f483f9d930f3f73df0221f11493cf10aa6c6be6c52019cccee0 not found: ID does not exist" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.772787 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/27111323-170a-4dfd-9620-8062abd30b0f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.947067 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 19:25:50 crc kubenswrapper[4909]: I1002 19:25:50.958474 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nz5sb"] Oct 02 19:25:51 crc kubenswrapper[4909]: I1002 19:25:51.635436 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27111323-170a-4dfd-9620-8062abd30b0f" path="/var/lib/kubelet/pods/27111323-170a-4dfd-9620-8062abd30b0f/volumes" Oct 02 19:26:08 crc kubenswrapper[4909]: I1002 19:26:08.820185 4909 generic.go:334] "Generic (PLEG): container finished" podID="e917646f-ee0c-442e-8a71-d637ef36a45e" containerID="83f267429cc7ca065d992541443cedcdc63349052402aa7b14f2cbb165558826" exitCode=0 Oct 02 19:26:08 crc kubenswrapper[4909]: I1002 19:26:08.820263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" event={"ID":"e917646f-ee0c-442e-8a71-d637ef36a45e","Type":"ContainerDied","Data":"83f267429cc7ca065d992541443cedcdc63349052402aa7b14f2cbb165558826"} Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.369554 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.545843 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7xnz\" (UniqueName: \"kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.546045 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.546080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.546216 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.546296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.546379 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph\") pod \"e917646f-ee0c-442e-8a71-d637ef36a45e\" (UID: \"e917646f-ee0c-442e-8a71-d637ef36a45e\") " Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.555278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph" (OuterVolumeSpecName: "ceph") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.556937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz" (OuterVolumeSpecName: "kube-api-access-n7xnz") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "kube-api-access-n7xnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.556898 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.578590 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory" (OuterVolumeSpecName: "inventory") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.601873 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.605016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e917646f-ee0c-442e-8a71-d637ef36a45e" (UID: "e917646f-ee0c-442e-8a71-d637ef36a45e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649001 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7xnz\" (UniqueName: \"kubernetes.io/projected/e917646f-ee0c-442e-8a71-d637ef36a45e-kube-api-access-n7xnz\") on node \"crc\" DevicePath \"\"" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649059 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649073 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649173 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 
19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649189 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.649200 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e917646f-ee0c-442e-8a71-d637ef36a45e-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.844420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" event={"ID":"e917646f-ee0c-442e-8a71-d637ef36a45e","Type":"ContainerDied","Data":"d3f7ec1649ce312ad9343d27e351004de634b0bf2d73cfee196b9314441ba97b"} Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.844515 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3f7ec1649ce312ad9343d27e351004de634b0bf2d73cfee196b9314441ba97b" Oct 02 19:26:10 crc kubenswrapper[4909]: I1002 19:26:10.844456 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.058459 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t"] Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059362 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059387 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059412 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059421 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059450 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059464 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059472 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059493 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059503 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059524 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059531 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059546 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059554 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="extract-content" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059566 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059573 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059592 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e917646f-ee0c-442e-8a71-d637ef36a45e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059600 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e917646f-ee0c-442e-8a71-d637ef36a45e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:26:11 crc kubenswrapper[4909]: E1002 19:26:11.059610 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059619 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="extract-utilities" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059899 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f389326-bf5f-4f35-9a9d-fcc41b3c5c71" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059923 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="27111323-170a-4dfd-9620-8062abd30b0f" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059945 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e917646f-ee0c-442e-8a71-d637ef36a45e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.059959 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="200a73a4-7b3d-4b70-aff7-1e8511a3a81a" containerName="registry-server" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.061157 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.065768 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.065887 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.066193 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.066276 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.066539 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.066717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.067383 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.068124 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.068659 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.074553 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t"] Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160498 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160642 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmlp\" (UniqueName: \"kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160695 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160816 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160852 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.160955 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.161051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.161098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.161191 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmlp\" (UniqueName: \"kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263173 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263215 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc 
kubenswrapper[4909]: I1002 19:26:11.263341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263441 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.263516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.264564 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.264572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.268537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.268619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 
02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.268706 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.268824 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.269962 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.270143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.271944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.273540 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.282306 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmlp\" (UniqueName: \"kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.386448 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:26:11 crc kubenswrapper[4909]: W1002 19:26:11.984739 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd535ce44_b440_434f_8526_3dd777d90ae8.slice/crio-4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf WatchSource:0}: Error finding container 4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf: Status 404 returned error can't find the container with id 4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf Oct 02 19:26:11 crc kubenswrapper[4909]: I1002 19:26:11.985680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t"] Oct 02 19:26:12 crc kubenswrapper[4909]: I1002 19:26:12.875146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" event={"ID":"d535ce44-b440-434f-8526-3dd777d90ae8","Type":"ContainerStarted","Data":"5e5f3be4112e77b68b7395197aed20187df4e5d43f04d5b007b45f63f8ef3afd"} Oct 02 19:26:12 crc kubenswrapper[4909]: I1002 19:26:12.876017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" event={"ID":"d535ce44-b440-434f-8526-3dd777d90ae8","Type":"ContainerStarted","Data":"4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf"} Oct 02 19:26:12 crc kubenswrapper[4909]: I1002 19:26:12.915084 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" podStartSLOduration=1.4622401520000001 podStartE2EDuration="1.915014122s" podCreationTimestamp="2025-10-02 19:26:11 +0000 UTC" firstStartedPulling="2025-10-02 19:26:11.987293612 +0000 UTC m=+4093.174789511" lastFinishedPulling="2025-10-02 19:26:12.440067622 +0000 
UTC m=+4093.627563481" observedRunningTime="2025-10-02 19:26:12.909309694 +0000 UTC m=+4094.096805583" watchObservedRunningTime="2025-10-02 19:26:12.915014122 +0000 UTC m=+4094.102510021" Oct 02 19:27:23 crc kubenswrapper[4909]: I1002 19:27:23.054499 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:27:23 crc kubenswrapper[4909]: I1002 19:27:23.055223 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:27:53 crc kubenswrapper[4909]: I1002 19:27:53.055158 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:27:53 crc kubenswrapper[4909]: I1002 19:27:53.055903 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.054953 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.055456 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.055501 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.056284 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.056427 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b" gracePeriod=600 Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.410426 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b" exitCode=0 Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.410502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b"} Oct 02 19:28:23 crc kubenswrapper[4909]: I1002 19:28:23.410555 4909 scope.go:117] "RemoveContainer" containerID="a39fbfc3551ae08cf18694dd37ea17231752b72dccc5d71919c95bca107ba756" Oct 02 19:28:24 crc kubenswrapper[4909]: I1002 19:28:24.421726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5"} Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.170377 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t"] Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.173724 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.176454 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.176701 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.185936 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t"] Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.311842 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: 
\"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.311900 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzkn\" (UniqueName: \"kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.311922 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.413894 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.414240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzkn\" (UniqueName: \"kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.414335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.415737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.420182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.443394 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzkn\" (UniqueName: \"kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn\") pod \"collect-profiles-29323890-xrk6t\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:00 crc kubenswrapper[4909]: I1002 19:30:00.538616 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:01 crc kubenswrapper[4909]: I1002 19:30:01.021982 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t"] Oct 02 19:30:01 crc kubenswrapper[4909]: W1002 19:30:01.025045 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c48ce2_b55d_4a49_a3e7_1fc253d6c668.slice/crio-1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d WatchSource:0}: Error finding container 1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d: Status 404 returned error can't find the container with id 1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d Oct 02 19:30:01 crc kubenswrapper[4909]: I1002 19:30:01.442042 4909 generic.go:334] "Generic (PLEG): container finished" podID="75c48ce2-b55d-4a49-a3e7-1fc253d6c668" containerID="f0e8da85ea2b32598b2ec2464fa9d7ceec7349d045518219bd41f85cca9429b7" exitCode=0 Oct 02 19:30:01 crc kubenswrapper[4909]: I1002 19:30:01.442104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" event={"ID":"75c48ce2-b55d-4a49-a3e7-1fc253d6c668","Type":"ContainerDied","Data":"f0e8da85ea2b32598b2ec2464fa9d7ceec7349d045518219bd41f85cca9429b7"} Oct 02 19:30:01 crc kubenswrapper[4909]: I1002 19:30:01.442425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" event={"ID":"75c48ce2-b55d-4a49-a3e7-1fc253d6c668","Type":"ContainerStarted","Data":"1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d"} Oct 02 19:30:02 crc kubenswrapper[4909]: I1002 19:30:02.921374 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:02 crc kubenswrapper[4909]: I1002 19:30:02.987467 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume\") pod \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " Oct 02 19:30:02 crc kubenswrapper[4909]: I1002 19:30:02.987626 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume\") pod \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " Oct 02 19:30:02 crc kubenswrapper[4909]: I1002 19:30:02.987726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwzkn\" (UniqueName: \"kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn\") pod \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\" (UID: \"75c48ce2-b55d-4a49-a3e7-1fc253d6c668\") " Oct 02 19:30:02 crc kubenswrapper[4909]: I1002 19:30:02.988275 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume" (OuterVolumeSpecName: "config-volume") pod "75c48ce2-b55d-4a49-a3e7-1fc253d6c668" (UID: "75c48ce2-b55d-4a49-a3e7-1fc253d6c668"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.091017 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.466478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" event={"ID":"75c48ce2-b55d-4a49-a3e7-1fc253d6c668","Type":"ContainerDied","Data":"1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d"} Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.466808 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d2b1316345e40670d3ef0daae8e4b82a93f0e8067771eeddf0b0e171b7a024d" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.466870 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.583450 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn" (OuterVolumeSpecName: "kube-api-access-mwzkn") pod "75c48ce2-b55d-4a49-a3e7-1fc253d6c668" (UID: "75c48ce2-b55d-4a49-a3e7-1fc253d6c668"). InnerVolumeSpecName "kube-api-access-mwzkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.583552 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75c48ce2-b55d-4a49-a3e7-1fc253d6c668" (UID: "75c48ce2-b55d-4a49-a3e7-1fc253d6c668"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.604675 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:03 crc kubenswrapper[4909]: I1002 19:30:03.604732 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwzkn\" (UniqueName: \"kubernetes.io/projected/75c48ce2-b55d-4a49-a3e7-1fc253d6c668-kube-api-access-mwzkn\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:04 crc kubenswrapper[4909]: I1002 19:30:04.002766 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms"] Oct 02 19:30:04 crc kubenswrapper[4909]: I1002 19:30:04.011338 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323845-f57ms"] Oct 02 19:30:05 crc kubenswrapper[4909]: I1002 19:30:05.619409 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914fac7d-d3a1-4cb1-b2d7-8e8821f08e15" path="/var/lib/kubelet/pods/914fac7d-d3a1-4cb1-b2d7-8e8821f08e15/volumes" Oct 02 19:30:23 crc kubenswrapper[4909]: I1002 19:30:23.054067 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:30:23 crc kubenswrapper[4909]: I1002 19:30:23.054578 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:30:25 
crc kubenswrapper[4909]: I1002 19:30:25.438868 4909 scope.go:117] "RemoveContainer" containerID="04dcec55e75c6baf2c55053dd968c5fc5960fec99a4a5909136dc166788b6cc1" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.554078 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:34 crc kubenswrapper[4909]: E1002 19:30:34.555890 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c48ce2-b55d-4a49-a3e7-1fc253d6c668" containerName="collect-profiles" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.555914 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c48ce2-b55d-4a49-a3e7-1fc253d6c668" containerName="collect-profiles" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.556413 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c48ce2-b55d-4a49-a3e7-1fc253d6c668" containerName="collect-profiles" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.559154 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.565899 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.752813 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.752953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.753012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cb8c\" (UniqueName: \"kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.854935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.855152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.855183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cb8c\" (UniqueName: \"kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.855903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.856121 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.877933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cb8c\" (UniqueName: \"kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c\") pod \"redhat-operators-rr2bg\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:34 crc kubenswrapper[4909]: I1002 19:30:34.892103 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:35 crc kubenswrapper[4909]: I1002 19:30:35.409307 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:35 crc kubenswrapper[4909]: I1002 19:30:35.860645 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerID="9fab0973eee8a08228b2ab0ccaef67958fbe5199daefae6d17b199edb42eca23" exitCode=0 Oct 02 19:30:35 crc kubenswrapper[4909]: I1002 19:30:35.860747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerDied","Data":"9fab0973eee8a08228b2ab0ccaef67958fbe5199daefae6d17b199edb42eca23"} Oct 02 19:30:35 crc kubenswrapper[4909]: I1002 19:30:35.860899 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerStarted","Data":"1cf4c81af61f53277921dd0d7189662c302131bb3e23de8ee610d5dc6cdc083c"} Oct 02 19:30:35 crc kubenswrapper[4909]: I1002 19:30:35.863870 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:30:37 crc kubenswrapper[4909]: I1002 19:30:37.908792 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerStarted","Data":"2877fd62ab348352a205d1e1b7c6a03a7b889ddeaa65a003f5b2fcd756f64e51"} Oct 02 19:30:40 crc kubenswrapper[4909]: I1002 19:30:40.961969 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerID="2877fd62ab348352a205d1e1b7c6a03a7b889ddeaa65a003f5b2fcd756f64e51" exitCode=0 Oct 02 19:30:40 crc kubenswrapper[4909]: I1002 19:30:40.962642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerDied","Data":"2877fd62ab348352a205d1e1b7c6a03a7b889ddeaa65a003f5b2fcd756f64e51"} Oct 02 19:30:41 crc kubenswrapper[4909]: I1002 19:30:41.974826 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerStarted","Data":"9e7043a1bb8431906f4360087d7169a109af3bf628b42471cb12f3019574da52"} Oct 02 19:30:42 crc kubenswrapper[4909]: I1002 19:30:42.005683 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rr2bg" podStartSLOduration=2.42228464 podStartE2EDuration="8.005662957s" podCreationTimestamp="2025-10-02 19:30:34 +0000 UTC" firstStartedPulling="2025-10-02 19:30:35.863633917 +0000 UTC m=+4357.051129776" lastFinishedPulling="2025-10-02 19:30:41.447012234 +0000 UTC m=+4362.634508093" observedRunningTime="2025-10-02 19:30:41.998772295 +0000 UTC m=+4363.186268154" watchObservedRunningTime="2025-10-02 19:30:42.005662957 +0000 UTC m=+4363.193158816" Oct 02 19:30:44 crc kubenswrapper[4909]: I1002 19:30:44.892568 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:44 crc kubenswrapper[4909]: I1002 19:30:44.892623 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:45 crc kubenswrapper[4909]: I1002 19:30:45.954637 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rr2bg" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="registry-server" probeResult="failure" output=< Oct 02 19:30:45 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:30:45 crc kubenswrapper[4909]: > Oct 02 19:30:53 crc kubenswrapper[4909]: I1002 
19:30:53.054139 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:30:53 crc kubenswrapper[4909]: I1002 19:30:53.054769 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:30:55 crc kubenswrapper[4909]: I1002 19:30:55.549535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:55 crc kubenswrapper[4909]: I1002 19:30:55.599176 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:55 crc kubenswrapper[4909]: I1002 19:30:55.789318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:57 crc kubenswrapper[4909]: I1002 19:30:57.158152 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rr2bg" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="registry-server" containerID="cri-o://9e7043a1bb8431906f4360087d7169a109af3bf628b42471cb12f3019574da52" gracePeriod=2 Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.172375 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerID="9e7043a1bb8431906f4360087d7169a109af3bf628b42471cb12f3019574da52" exitCode=0 Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.172473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rr2bg" event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerDied","Data":"9e7043a1bb8431906f4360087d7169a109af3bf628b42471cb12f3019574da52"} Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.313430 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.431707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cb8c\" (UniqueName: \"kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c\") pod \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.431771 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content\") pod \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.431881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities\") pod \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\" (UID: \"6b63eda1-b0b2-4ada-9a03-ea17d38de933\") " Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.433242 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities" (OuterVolumeSpecName: "utilities") pod "6b63eda1-b0b2-4ada-9a03-ea17d38de933" (UID: "6b63eda1-b0b2-4ada-9a03-ea17d38de933"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.437549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c" (OuterVolumeSpecName: "kube-api-access-9cb8c") pod "6b63eda1-b0b2-4ada-9a03-ea17d38de933" (UID: "6b63eda1-b0b2-4ada-9a03-ea17d38de933"). InnerVolumeSpecName "kube-api-access-9cb8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.514094 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b63eda1-b0b2-4ada-9a03-ea17d38de933" (UID: "6b63eda1-b0b2-4ada-9a03-ea17d38de933"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.534429 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.534462 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b63eda1-b0b2-4ada-9a03-ea17d38de933-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:58 crc kubenswrapper[4909]: I1002 19:30:58.534474 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cb8c\" (UniqueName: \"kubernetes.io/projected/6b63eda1-b0b2-4ada-9a03-ea17d38de933-kube-api-access-9cb8c\") on node \"crc\" DevicePath \"\"" Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.192278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rr2bg" 
event={"ID":"6b63eda1-b0b2-4ada-9a03-ea17d38de933","Type":"ContainerDied","Data":"1cf4c81af61f53277921dd0d7189662c302131bb3e23de8ee610d5dc6cdc083c"} Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.192383 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rr2bg" Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.192769 4909 scope.go:117] "RemoveContainer" containerID="9e7043a1bb8431906f4360087d7169a109af3bf628b42471cb12f3019574da52" Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.231333 4909 scope.go:117] "RemoveContainer" containerID="2877fd62ab348352a205d1e1b7c6a03a7b889ddeaa65a003f5b2fcd756f64e51" Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.268426 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.279445 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rr2bg"] Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.304759 4909 scope.go:117] "RemoveContainer" containerID="9fab0973eee8a08228b2ab0ccaef67958fbe5199daefae6d17b199edb42eca23" Oct 02 19:30:59 crc kubenswrapper[4909]: I1002 19:30:59.633473 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" path="/var/lib/kubelet/pods/6b63eda1-b0b2-4ada-9a03-ea17d38de933/volumes" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.054730 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.055449 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.055517 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.056814 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.057126 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" gracePeriod=600 Oct 02 19:31:23 crc kubenswrapper[4909]: E1002 19:31:23.191928 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.512391 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" 
containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" exitCode=0 Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.512726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5"} Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.512759 4909 scope.go:117] "RemoveContainer" containerID="c0210316922fb9936b7e40719074195ee6f92c8eb0ba0771c217dadc7636625b" Oct 02 19:31:23 crc kubenswrapper[4909]: I1002 19:31:23.513396 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:31:23 crc kubenswrapper[4909]: E1002 19:31:23.513628 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:31:33 crc kubenswrapper[4909]: I1002 19:31:33.645711 4909 generic.go:334] "Generic (PLEG): container finished" podID="d535ce44-b440-434f-8526-3dd777d90ae8" containerID="5e5f3be4112e77b68b7395197aed20187df4e5d43f04d5b007b45f63f8ef3afd" exitCode=0 Oct 02 19:31:33 crc kubenswrapper[4909]: I1002 19:31:33.645799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" event={"ID":"d535ce44-b440-434f-8526-3dd777d90ae8","Type":"ContainerDied","Data":"5e5f3be4112e77b68b7395197aed20187df4e5d43f04d5b007b45f63f8ef3afd"} Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.166098 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.323859 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.323972 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppmlp\" (UniqueName: \"kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324157 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324353 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324589 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0\") pod \"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.324667 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph\") pod 
\"d535ce44-b440-434f-8526-3dd777d90ae8\" (UID: \"d535ce44-b440-434f-8526-3dd777d90ae8\") " Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.333418 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp" (OuterVolumeSpecName: "kube-api-access-ppmlp") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "kube-api-access-ppmlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.335044 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.353368 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph" (OuterVolumeSpecName: "ceph") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.363145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.371498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.373460 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.374104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory" (OuterVolumeSpecName: "inventory") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.377623 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.385305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.388534 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.394426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d535ce44-b440-434f-8526-3dd777d90ae8" (UID: "d535ce44-b440-434f-8526-3dd777d90ae8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428385 4909 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428420 4909 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428432 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428444 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428457 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428468 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428478 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428492 4909 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d535ce44-b440-434f-8526-3dd777d90ae8-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428502 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428511 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d535ce44-b440-434f-8526-3dd777d90ae8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.428520 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppmlp\" (UniqueName: \"kubernetes.io/projected/d535ce44-b440-434f-8526-3dd777d90ae8-kube-api-access-ppmlp\") on node \"crc\" DevicePath \"\"" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.609288 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:31:35 crc kubenswrapper[4909]: E1002 19:31:35.609946 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.667012 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" event={"ID":"d535ce44-b440-434f-8526-3dd777d90ae8","Type":"ContainerDied","Data":"4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf"} Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.667098 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f003197861fd709e451c9b05c2500819d3f97615679261e611abf50699ee9cf" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.667115 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.854480 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427"] Oct 02 19:31:35 crc kubenswrapper[4909]: E1002 19:31:35.855059 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="registry-server" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855088 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="registry-server" Oct 02 19:31:35 crc kubenswrapper[4909]: E1002 19:31:35.855115 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="extract-content" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855128 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="extract-content" Oct 02 19:31:35 crc kubenswrapper[4909]: E1002 19:31:35.855143 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="extract-utilities" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855153 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="extract-utilities" Oct 02 19:31:35 crc kubenswrapper[4909]: E1002 19:31:35.855168 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d535ce44-b440-434f-8526-3dd777d90ae8" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855177 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d535ce44-b440-434f-8526-3dd777d90ae8" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855452 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d535ce44-b440-434f-8526-3dd777d90ae8" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.855486 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b63eda1-b0b2-4ada-9a03-ea17d38de933" containerName="registry-server" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.856406 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.860849 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.861115 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.861229 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.861236 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.861330 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.862433 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.869170 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427"] Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.939429 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvm97\" (UniqueName: \"kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.939908 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940056 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940249 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:35 crc kubenswrapper[4909]: I1002 19:31:35.940942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvm97\" (UniqueName: \"kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043549 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.043967 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.044155 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.044301 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.048915 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.049628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.049765 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.050063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.050094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.051684 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.052430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.066231 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hvm97\" (UniqueName: \"kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k427\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:36 crc kubenswrapper[4909]: I1002 19:31:36.192847 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:31:37 crc kubenswrapper[4909]: I1002 19:31:36.762933 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427"] Oct 02 19:31:37 crc kubenswrapper[4909]: I1002 19:31:37.721400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" event={"ID":"fc3aeeca-599a-4f61-92d4-9a09ad65206f","Type":"ContainerStarted","Data":"b6ab0f0dc8dff25bec5770f91906c59840537d17594f45f36aa120f47f8810a9"} Oct 02 19:31:38 crc kubenswrapper[4909]: I1002 19:31:38.737379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" event={"ID":"fc3aeeca-599a-4f61-92d4-9a09ad65206f","Type":"ContainerStarted","Data":"ff78bc343a63b6ac96c353dfa8068ef634a0869c99d09c6c522fa5112b0f20b3"} Oct 02 19:31:38 crc kubenswrapper[4909]: I1002 19:31:38.761195 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" podStartSLOduration=2.96151223 podStartE2EDuration="3.761168035s" podCreationTimestamp="2025-10-02 19:31:35 +0000 UTC" firstStartedPulling="2025-10-02 19:31:36.765357021 +0000 UTC m=+4417.952852880" lastFinishedPulling="2025-10-02 19:31:37.565012826 +0000 UTC m=+4418.752508685" observedRunningTime="2025-10-02 19:31:38.753753367 +0000 UTC m=+4419.941249236" watchObservedRunningTime="2025-10-02 19:31:38.761168035 +0000 
UTC m=+4419.948663904" Oct 02 19:31:47 crc kubenswrapper[4909]: I1002 19:31:47.608659 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:31:47 crc kubenswrapper[4909]: E1002 19:31:47.609670 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:31:58 crc kubenswrapper[4909]: I1002 19:31:58.609109 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:31:58 crc kubenswrapper[4909]: E1002 19:31:58.610217 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:32:10 crc kubenswrapper[4909]: I1002 19:32:10.608863 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:32:10 crc kubenswrapper[4909]: E1002 19:32:10.609693 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:32:24 crc kubenswrapper[4909]: I1002 19:32:24.609413 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:32:24 crc kubenswrapper[4909]: E1002 19:32:24.610066 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:32:35 crc kubenswrapper[4909]: I1002 19:32:35.609244 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:32:35 crc kubenswrapper[4909]: E1002 19:32:35.609983 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:32:48 crc kubenswrapper[4909]: I1002 19:32:48.609340 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:32:48 crc kubenswrapper[4909]: E1002 19:32:48.610223 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:33:02 crc kubenswrapper[4909]: I1002 19:33:02.608793 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:33:02 crc kubenswrapper[4909]: E1002 19:33:02.609668 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:33:17 crc kubenswrapper[4909]: I1002 19:33:17.609752 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:33:17 crc kubenswrapper[4909]: E1002 19:33:17.610754 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:33:29 crc kubenswrapper[4909]: I1002 19:33:29.621154 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:33:29 crc kubenswrapper[4909]: E1002 19:33:29.622110 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:33:44 crc kubenswrapper[4909]: I1002 19:33:44.609583 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:33:44 crc kubenswrapper[4909]: E1002 19:33:44.610396 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:33:57 crc kubenswrapper[4909]: I1002 19:33:57.608352 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:33:57 crc kubenswrapper[4909]: E1002 19:33:57.609355 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:34:08 crc kubenswrapper[4909]: I1002 19:34:08.609695 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:34:08 crc kubenswrapper[4909]: E1002 19:34:08.610347 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:34:20 crc kubenswrapper[4909]: I1002 19:34:20.608545 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:34:20 crc kubenswrapper[4909]: E1002 19:34:20.609556 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:34:33 crc kubenswrapper[4909]: I1002 19:34:33.608989 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:34:33 crc kubenswrapper[4909]: E1002 19:34:33.610001 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:34:47 crc kubenswrapper[4909]: I1002 19:34:47.609167 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:34:47 crc kubenswrapper[4909]: E1002 19:34:47.609975 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:02 crc kubenswrapper[4909]: I1002 19:35:02.609327 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:35:02 crc kubenswrapper[4909]: E1002 19:35:02.610370 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:12 crc kubenswrapper[4909]: I1002 19:35:12.220565 4909 generic.go:334] "Generic (PLEG): container finished" podID="fc3aeeca-599a-4f61-92d4-9a09ad65206f" containerID="ff78bc343a63b6ac96c353dfa8068ef634a0869c99d09c6c522fa5112b0f20b3" exitCode=0 Oct 02 19:35:12 crc kubenswrapper[4909]: I1002 19:35:12.220707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" event={"ID":"fc3aeeca-599a-4f61-92d4-9a09ad65206f","Type":"ContainerDied","Data":"ff78bc343a63b6ac96c353dfa8068ef634a0869c99d09c6c522fa5112b0f20b3"} Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.791855 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902376 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902458 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902595 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902629 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvm97\" (UniqueName: \"kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 
19:35:13.902751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902771 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.902848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0\") pod \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\" (UID: \"fc3aeeca-599a-4f61-92d4-9a09ad65206f\") " Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.909278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97" (OuterVolumeSpecName: "kube-api-access-hvm97") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "kube-api-access-hvm97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.909664 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph" (OuterVolumeSpecName: "ceph") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.910712 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.940794 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.943005 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory" (OuterVolumeSpecName: "inventory") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.944748 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.946326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:13 crc kubenswrapper[4909]: I1002 19:35:13.946689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "fc3aeeca-599a-4f61-92d4-9a09ad65206f" (UID: "fc3aeeca-599a-4f61-92d4-9a09ad65206f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006132 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006180 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006194 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvm97\" (UniqueName: \"kubernetes.io/projected/fc3aeeca-599a-4f61-92d4-9a09ad65206f-kube-api-access-hvm97\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006206 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006218 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006230 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006243 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.006256 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fc3aeeca-599a-4f61-92d4-9a09ad65206f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.245777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" event={"ID":"fc3aeeca-599a-4f61-92d4-9a09ad65206f","Type":"ContainerDied","Data":"b6ab0f0dc8dff25bec5770f91906c59840537d17594f45f36aa120f47f8810a9"} Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.245843 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6ab0f0dc8dff25bec5770f91906c59840537d17594f45f36aa120f47f8810a9" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.245894 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k427" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.371347 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24"] Oct 02 19:35:14 crc kubenswrapper[4909]: E1002 19:35:14.372300 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3aeeca-599a-4f61-92d4-9a09ad65206f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.372331 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3aeeca-599a-4f61-92d4-9a09ad65206f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.372620 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3aeeca-599a-4f61-92d4-9a09ad65206f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.376492 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.390319 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.390411 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.390831 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.391212 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.391423 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.391421 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.404899 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24"] Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.517331 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.517432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.517486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.518225 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.518441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.518518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.518835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.518956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8txm\" (UniqueName: \"kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.608674 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:35:14 crc kubenswrapper[4909]: E1002 19:35:14.609253 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:14 crc 
kubenswrapper[4909]: I1002 19:35:14.620908 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.620973 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621096 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8txm\" 
(UniqueName: \"kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621215 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.621288 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.628303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.628363 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.629158 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.631017 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.631708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.636126 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.637841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.643625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8txm\" (UniqueName: \"kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:14 crc kubenswrapper[4909]: I1002 19:35:14.727634 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:35:15 crc kubenswrapper[4909]: I1002 19:35:15.358812 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24"] Oct 02 19:35:16 crc kubenswrapper[4909]: I1002 19:35:16.275044 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" event={"ID":"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7","Type":"ContainerStarted","Data":"943d946159f22f1369702e36b95cbe6c2e5ed489eb3e107fc630f7e90c04276c"} Oct 02 19:35:17 crc kubenswrapper[4909]: I1002 19:35:17.291658 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" event={"ID":"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7","Type":"ContainerStarted","Data":"c1b860631f75d77c4a4593b3d52296e4335bdd29689edabb5a04eaadb20d37ce"} Oct 02 19:35:17 crc kubenswrapper[4909]: I1002 19:35:17.323327 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" podStartSLOduration=2.818093351 podStartE2EDuration="3.323308428s" podCreationTimestamp="2025-10-02 19:35:14 +0000 UTC" firstStartedPulling="2025-10-02 19:35:15.920687364 +0000 UTC m=+4637.108183233" lastFinishedPulling="2025-10-02 19:35:16.425902441 +0000 UTC m=+4637.613398310" observedRunningTime="2025-10-02 19:35:17.318420778 +0000 UTC m=+4638.505916657" watchObservedRunningTime="2025-10-02 19:35:17.323308428 +0000 UTC m=+4638.510804287" Oct 02 19:35:27 crc kubenswrapper[4909]: I1002 19:35:27.608900 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:35:27 crc kubenswrapper[4909]: E1002 19:35:27.611061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:39 crc kubenswrapper[4909]: I1002 19:35:39.615325 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:35:39 crc kubenswrapper[4909]: E1002 19:35:39.616670 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:53 crc kubenswrapper[4909]: I1002 19:35:53.608790 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:35:53 crc kubenswrapper[4909]: E1002 19:35:53.609939 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.175366 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.179473 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.184615 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.322630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5r4v\" (UniqueName: \"kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.322739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.322775 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.424599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5r4v\" (UniqueName: \"kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.425041 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.425134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.425954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.426007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.456875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5r4v\" (UniqueName: \"kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v\") pod \"certified-operators-xjnxc\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:56 crc kubenswrapper[4909]: I1002 19:35:56.511862 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:35:57 crc kubenswrapper[4909]: I1002 19:35:57.104224 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:35:57 crc kubenswrapper[4909]: I1002 19:35:57.803650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerStarted","Data":"0be15857fdf1c9aac6776f64e4ffe04b828d747b6849dfc6d9480830b9f3309f"} Oct 02 19:35:58 crc kubenswrapper[4909]: I1002 19:35:58.822215 4909 generic.go:334] "Generic (PLEG): container finished" podID="1bc77106-6865-405f-ad68-ae1b218128c4" containerID="731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b" exitCode=0 Oct 02 19:35:58 crc kubenswrapper[4909]: I1002 19:35:58.822336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerDied","Data":"731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b"} Oct 02 19:35:58 crc kubenswrapper[4909]: I1002 19:35:58.827498 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:35:59 crc kubenswrapper[4909]: I1002 19:35:59.838565 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerStarted","Data":"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b"} Oct 02 19:36:00 crc kubenswrapper[4909]: I1002 19:36:00.852716 4909 generic.go:334] "Generic (PLEG): container finished" podID="1bc77106-6865-405f-ad68-ae1b218128c4" containerID="3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b" exitCode=0 Oct 02 19:36:00 crc kubenswrapper[4909]: I1002 19:36:00.852766 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerDied","Data":"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b"} Oct 02 19:36:01 crc kubenswrapper[4909]: I1002 19:36:01.864784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerStarted","Data":"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345"} Oct 02 19:36:01 crc kubenswrapper[4909]: I1002 19:36:01.885570 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjnxc" podStartSLOduration=3.463024447 podStartE2EDuration="5.885550219s" podCreationTimestamp="2025-10-02 19:35:56 +0000 UTC" firstStartedPulling="2025-10-02 19:35:58.82710093 +0000 UTC m=+4680.014596829" lastFinishedPulling="2025-10-02 19:36:01.249626742 +0000 UTC m=+4682.437122601" observedRunningTime="2025-10-02 19:36:01.88140047 +0000 UTC m=+4683.068896349" watchObservedRunningTime="2025-10-02 19:36:01.885550219 +0000 UTC m=+4683.073046078" Oct 02 19:36:05 crc kubenswrapper[4909]: I1002 19:36:05.608999 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:36:05 crc kubenswrapper[4909]: E1002 19:36:05.609595 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:36:06 crc kubenswrapper[4909]: I1002 19:36:06.512614 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:06 crc kubenswrapper[4909]: I1002 19:36:06.512676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:06 crc kubenswrapper[4909]: I1002 19:36:06.574352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:06 crc kubenswrapper[4909]: I1002 19:36:06.997107 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:07 crc kubenswrapper[4909]: I1002 19:36:07.059266 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:36:08 crc kubenswrapper[4909]: I1002 19:36:08.948501 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xjnxc" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="registry-server" containerID="cri-o://9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345" gracePeriod=2 Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.527863 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.633108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content\") pod \"1bc77106-6865-405f-ad68-ae1b218128c4\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.633143 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5r4v\" (UniqueName: \"kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v\") pod \"1bc77106-6865-405f-ad68-ae1b218128c4\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.633258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities\") pod \"1bc77106-6865-405f-ad68-ae1b218128c4\" (UID: \"1bc77106-6865-405f-ad68-ae1b218128c4\") " Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.634897 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities" (OuterVolumeSpecName: "utilities") pod "1bc77106-6865-405f-ad68-ae1b218128c4" (UID: "1bc77106-6865-405f-ad68-ae1b218128c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.647235 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v" (OuterVolumeSpecName: "kube-api-access-q5r4v") pod "1bc77106-6865-405f-ad68-ae1b218128c4" (UID: "1bc77106-6865-405f-ad68-ae1b218128c4"). InnerVolumeSpecName "kube-api-access-q5r4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.697608 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bc77106-6865-405f-ad68-ae1b218128c4" (UID: "1bc77106-6865-405f-ad68-ae1b218128c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.737004 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.737077 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5r4v\" (UniqueName: \"kubernetes.io/projected/1bc77106-6865-405f-ad68-ae1b218128c4-kube-api-access-q5r4v\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.737101 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc77106-6865-405f-ad68-ae1b218128c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.963336 4909 generic.go:334] "Generic (PLEG): container finished" podID="1bc77106-6865-405f-ad68-ae1b218128c4" containerID="9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345" exitCode=0 Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.963409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerDied","Data":"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345"} Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.963472 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xjnxc" event={"ID":"1bc77106-6865-405f-ad68-ae1b218128c4","Type":"ContainerDied","Data":"0be15857fdf1c9aac6776f64e4ffe04b828d747b6849dfc6d9480830b9f3309f"} Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.963511 4909 scope.go:117] "RemoveContainer" containerID="9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.964119 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjnxc" Oct 02 19:36:09 crc kubenswrapper[4909]: I1002 19:36:09.996075 4909 scope.go:117] "RemoveContainer" containerID="3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.000847 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.011163 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xjnxc"] Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.698780 4909 scope.go:117] "RemoveContainer" containerID="731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.882340 4909 scope.go:117] "RemoveContainer" containerID="9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345" Oct 02 19:36:10 crc kubenswrapper[4909]: E1002 19:36:10.882826 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345\": container with ID starting with 9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345 not found: ID does not exist" containerID="9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 
19:36:10.882884 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345"} err="failed to get container status \"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345\": rpc error: code = NotFound desc = could not find container \"9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345\": container with ID starting with 9cd7b9d6f79fc96a92f7ea58b74fc559f4c5c615a1fd1fca86977d20cd3dc345 not found: ID does not exist" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.882921 4909 scope.go:117] "RemoveContainer" containerID="3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b" Oct 02 19:36:10 crc kubenswrapper[4909]: E1002 19:36:10.883372 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b\": container with ID starting with 3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b not found: ID does not exist" containerID="3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.883407 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b"} err="failed to get container status \"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b\": rpc error: code = NotFound desc = could not find container \"3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b\": container with ID starting with 3d678f5e6fdf3c8567f0fc1cb551efad3d719dca692f273a0ccb3c7cd0b6c20b not found: ID does not exist" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.883428 4909 scope.go:117] "RemoveContainer" containerID="731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b" Oct 02 19:36:10 crc 
kubenswrapper[4909]: E1002 19:36:10.883832 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b\": container with ID starting with 731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b not found: ID does not exist" containerID="731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b" Oct 02 19:36:10 crc kubenswrapper[4909]: I1002 19:36:10.883872 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b"} err="failed to get container status \"731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b\": rpc error: code = NotFound desc = could not find container \"731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b\": container with ID starting with 731a6d9cc2237774ed5f2570bb0f5965516a0d02b1c3d140886a062fdd81ff9b not found: ID does not exist" Oct 02 19:36:11 crc kubenswrapper[4909]: I1002 19:36:11.626236 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" path="/var/lib/kubelet/pods/1bc77106-6865-405f-ad68-ae1b218128c4/volumes" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.309845 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:14 crc kubenswrapper[4909]: E1002 19:36:14.310892 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="registry-server" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.310915 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="registry-server" Oct 02 19:36:14 crc kubenswrapper[4909]: E1002 19:36:14.310973 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="extract-content" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.310987 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="extract-content" Oct 02 19:36:14 crc kubenswrapper[4909]: E1002 19:36:14.311049 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="extract-utilities" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.311065 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="extract-utilities" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.311460 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc77106-6865-405f-ad68-ae1b218128c4" containerName="registry-server" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.314214 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.328533 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.455780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzgdd\" (UniqueName: \"kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.455864 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content\") pod \"redhat-marketplace-2hrw8\" (UID: 
\"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.456463 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.559050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.559479 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.560240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzgdd\" (UniqueName: \"kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.560588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content\") pod \"redhat-marketplace-2hrw8\" (UID: 
\"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.560952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.582823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzgdd\" (UniqueName: \"kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd\") pod \"redhat-marketplace-2hrw8\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:14 crc kubenswrapper[4909]: I1002 19:36:14.688978 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:15 crc kubenswrapper[4909]: I1002 19:36:15.159932 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:16 crc kubenswrapper[4909]: I1002 19:36:16.062879 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerID="98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184" exitCode=0 Oct 02 19:36:16 crc kubenswrapper[4909]: I1002 19:36:16.062930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerDied","Data":"98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184"} Oct 02 19:36:16 crc kubenswrapper[4909]: I1002 19:36:16.063186 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" 
event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerStarted","Data":"34c885225c72d7ff91d64ef741d96dd1c6969dd72648d2e05eb8ccc8fd2caeaa"} Oct 02 19:36:16 crc kubenswrapper[4909]: I1002 19:36:16.608845 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:36:16 crc kubenswrapper[4909]: E1002 19:36:16.609486 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:36:17 crc kubenswrapper[4909]: I1002 19:36:17.076437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerStarted","Data":"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a"} Oct 02 19:36:18 crc kubenswrapper[4909]: I1002 19:36:18.092970 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerID="10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a" exitCode=0 Oct 02 19:36:18 crc kubenswrapper[4909]: I1002 19:36:18.093065 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerDied","Data":"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a"} Oct 02 19:36:20 crc kubenswrapper[4909]: I1002 19:36:20.116040 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" 
event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerStarted","Data":"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21"} Oct 02 19:36:20 crc kubenswrapper[4909]: I1002 19:36:20.148788 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2hrw8" podStartSLOduration=3.277584954 podStartE2EDuration="6.148766898s" podCreationTimestamp="2025-10-02 19:36:14 +0000 UTC" firstStartedPulling="2025-10-02 19:36:16.065109991 +0000 UTC m=+4697.252605850" lastFinishedPulling="2025-10-02 19:36:18.936291925 +0000 UTC m=+4700.123787794" observedRunningTime="2025-10-02 19:36:20.136562671 +0000 UTC m=+4701.324058540" watchObservedRunningTime="2025-10-02 19:36:20.148766898 +0000 UTC m=+4701.336262757" Oct 02 19:36:24 crc kubenswrapper[4909]: I1002 19:36:24.689291 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:24 crc kubenswrapper[4909]: I1002 19:36:24.689831 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:24 crc kubenswrapper[4909]: I1002 19:36:24.747225 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:25 crc kubenswrapper[4909]: I1002 19:36:25.241790 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:25 crc kubenswrapper[4909]: I1002 19:36:25.283901 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.206809 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2hrw8" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="registry-server" 
containerID="cri-o://964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21" gracePeriod=2 Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.803959 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.976163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzgdd\" (UniqueName: \"kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd\") pod \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.976753 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities\") pod \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.976876 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content\") pod \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\" (UID: \"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7\") " Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.977941 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities" (OuterVolumeSpecName: "utilities") pod "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" (UID: "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.985557 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd" (OuterVolumeSpecName: "kube-api-access-gzgdd") pod "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" (UID: "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7"). InnerVolumeSpecName "kube-api-access-gzgdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:36:27 crc kubenswrapper[4909]: I1002 19:36:27.992201 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" (UID: "c2fbb04e-f2c4-4a8f-8eab-051402eb99f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.080355 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzgdd\" (UniqueName: \"kubernetes.io/projected/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-kube-api-access-gzgdd\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.080397 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.080409 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.236537 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" 
containerID="964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21" exitCode=0 Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.236589 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerDied","Data":"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21"} Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.236688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hrw8" event={"ID":"c2fbb04e-f2c4-4a8f-8eab-051402eb99f7","Type":"ContainerDied","Data":"34c885225c72d7ff91d64ef741d96dd1c6969dd72648d2e05eb8ccc8fd2caeaa"} Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.236714 4909 scope.go:117] "RemoveContainer" containerID="964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.236776 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hrw8" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.285153 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.287231 4909 scope.go:117] "RemoveContainer" containerID="10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.302205 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hrw8"] Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.307447 4909 scope.go:117] "RemoveContainer" containerID="98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.350779 4909 scope.go:117] "RemoveContainer" containerID="964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21" Oct 02 19:36:28 crc kubenswrapper[4909]: E1002 19:36:28.351220 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21\": container with ID starting with 964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21 not found: ID does not exist" containerID="964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.351272 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21"} err="failed to get container status \"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21\": rpc error: code = NotFound desc = could not find container \"964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21\": container with ID starting with 964fd6a4b815cb49d621f237d88bff26242fd93d4b8f7de89189210029424b21 not found: 
ID does not exist" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.351306 4909 scope.go:117] "RemoveContainer" containerID="10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a" Oct 02 19:36:28 crc kubenswrapper[4909]: E1002 19:36:28.351589 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a\": container with ID starting with 10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a not found: ID does not exist" containerID="10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.351625 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a"} err="failed to get container status \"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a\": rpc error: code = NotFound desc = could not find container \"10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a\": container with ID starting with 10543141d94a83f54142e60dec3767a0c4d29f1c1235ff6e699172de29c46f2a not found: ID does not exist" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.351644 4909 scope.go:117] "RemoveContainer" containerID="98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184" Oct 02 19:36:28 crc kubenswrapper[4909]: E1002 19:36:28.352141 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184\": container with ID starting with 98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184 not found: ID does not exist" containerID="98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184" Oct 02 19:36:28 crc kubenswrapper[4909]: I1002 19:36:28.352173 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184"} err="failed to get container status \"98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184\": rpc error: code = NotFound desc = could not find container \"98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184\": container with ID starting with 98bc684a640dc81f3f2fe0fb54eefa1ea60b9723645d16ca241b375eed5c0184 not found: ID does not exist" Oct 02 19:36:29 crc kubenswrapper[4909]: I1002 19:36:29.625582 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" path="/var/lib/kubelet/pods/c2fbb04e-f2c4-4a8f-8eab-051402eb99f7/volumes" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.403856 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:31 crc kubenswrapper[4909]: E1002 19:36:31.404838 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="registry-server" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.404868 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="registry-server" Oct 02 19:36:31 crc kubenswrapper[4909]: E1002 19:36:31.404960 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="extract-utilities" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.404981 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="extract-utilities" Oct 02 19:36:31 crc kubenswrapper[4909]: E1002 19:36:31.405051 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="extract-content" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.405071 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="extract-content" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.405585 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fbb04e-f2c4-4a8f-8eab-051402eb99f7" containerName="registry-server" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.409463 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.417687 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.561526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qxl\" (UniqueName: \"kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.561993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.562169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 
19:36:31.608857 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.668010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.668234 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.668417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qxl\" (UniqueName: \"kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.669757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.670155 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content\") pod \"community-operators-kdjh2\" (UID: 
\"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.700281 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qxl\" (UniqueName: \"kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl\") pod \"community-operators-kdjh2\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:31 crc kubenswrapper[4909]: I1002 19:36:31.790925 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:32 crc kubenswrapper[4909]: I1002 19:36:32.286331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac"} Oct 02 19:36:32 crc kubenswrapper[4909]: I1002 19:36:32.322256 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:33 crc kubenswrapper[4909]: I1002 19:36:33.304852 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerID="c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195" exitCode=0 Oct 02 19:36:33 crc kubenswrapper[4909]: I1002 19:36:33.304944 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerDied","Data":"c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195"} Oct 02 19:36:33 crc kubenswrapper[4909]: I1002 19:36:33.308298 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" 
event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerStarted","Data":"4fb03865ca4783a7a0a1ec7e5ebe11008a7e772c957f95d9710c552a11adba1b"} Oct 02 19:36:34 crc kubenswrapper[4909]: I1002 19:36:34.325969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerStarted","Data":"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7"} Oct 02 19:36:35 crc kubenswrapper[4909]: I1002 19:36:35.341484 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerID="e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7" exitCode=0 Oct 02 19:36:35 crc kubenswrapper[4909]: I1002 19:36:35.341543 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerDied","Data":"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7"} Oct 02 19:36:36 crc kubenswrapper[4909]: I1002 19:36:36.357488 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerStarted","Data":"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd"} Oct 02 19:36:36 crc kubenswrapper[4909]: I1002 19:36:36.383806 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdjh2" podStartSLOduration=2.9196436390000002 podStartE2EDuration="5.383772173s" podCreationTimestamp="2025-10-02 19:36:31 +0000 UTC" firstStartedPulling="2025-10-02 19:36:33.30829027 +0000 UTC m=+4714.495786129" lastFinishedPulling="2025-10-02 19:36:35.772418754 +0000 UTC m=+4716.959914663" observedRunningTime="2025-10-02 19:36:36.38041326 +0000 UTC m=+4717.567909119" watchObservedRunningTime="2025-10-02 19:36:36.383772173 +0000 UTC 
m=+4717.571268092" Oct 02 19:36:41 crc kubenswrapper[4909]: I1002 19:36:41.791812 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:41 crc kubenswrapper[4909]: I1002 19:36:41.793877 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:41 crc kubenswrapper[4909]: I1002 19:36:41.846805 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:42 crc kubenswrapper[4909]: I1002 19:36:42.508741 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:42 crc kubenswrapper[4909]: I1002 19:36:42.561552 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:44 crc kubenswrapper[4909]: I1002 19:36:44.452094 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdjh2" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="registry-server" containerID="cri-o://fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd" gracePeriod=2 Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.057262 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.181720 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8qxl\" (UniqueName: \"kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl\") pod \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.181829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content\") pod \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.181898 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities\") pod \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\" (UID: \"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53\") " Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.183159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities" (OuterVolumeSpecName: "utilities") pod "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" (UID: "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.230663 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" (UID: "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.284537 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.284592 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.468226 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerID="fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd" exitCode=0 Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.468268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerDied","Data":"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd"} Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.468301 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdjh2" event={"ID":"ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53","Type":"ContainerDied","Data":"4fb03865ca4783a7a0a1ec7e5ebe11008a7e772c957f95d9710c552a11adba1b"} Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.468308 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdjh2" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.468323 4909 scope.go:117] "RemoveContainer" containerID="fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.493996 4909 scope.go:117] "RemoveContainer" containerID="e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.883418 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl" (OuterVolumeSpecName: "kube-api-access-l8qxl") pod "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" (UID: "ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53"). InnerVolumeSpecName "kube-api-access-l8qxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.899407 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8qxl\" (UniqueName: \"kubernetes.io/projected/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53-kube-api-access-l8qxl\") on node \"crc\" DevicePath \"\"" Oct 02 19:36:45 crc kubenswrapper[4909]: I1002 19:36:45.928479 4909 scope.go:117] "RemoveContainer" containerID="c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.028045 4909 scope.go:117] "RemoveContainer" containerID="fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd" Oct 02 19:36:46 crc kubenswrapper[4909]: E1002 19:36:46.028831 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd\": container with ID starting with fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd not found: ID does not exist" 
containerID="fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.028887 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd"} err="failed to get container status \"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd\": rpc error: code = NotFound desc = could not find container \"fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd\": container with ID starting with fb31d6e41c36522cb93cb739e88bf934bdfcdaf655bd40dbd08752fad7a22bdd not found: ID does not exist" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.028923 4909 scope.go:117] "RemoveContainer" containerID="e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7" Oct 02 19:36:46 crc kubenswrapper[4909]: E1002 19:36:46.029595 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7\": container with ID starting with e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7 not found: ID does not exist" containerID="e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.029630 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7"} err="failed to get container status \"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7\": rpc error: code = NotFound desc = could not find container \"e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7\": container with ID starting with e2a482d24d5172ae58a48fecfb949cb91bb1b655a9b1bc4c56ec63e40cade1d7 not found: ID does not exist" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.029653 4909 scope.go:117] 
"RemoveContainer" containerID="c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195" Oct 02 19:36:46 crc kubenswrapper[4909]: E1002 19:36:46.029972 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195\": container with ID starting with c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195 not found: ID does not exist" containerID="c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.029993 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195"} err="failed to get container status \"c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195\": rpc error: code = NotFound desc = could not find container \"c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195\": container with ID starting with c571441a2a7c99c9cbce544e1f236c2d44df7e58771362758dc39000f0c17195 not found: ID does not exist" Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.253622 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:46 crc kubenswrapper[4909]: I1002 19:36:46.271049 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdjh2"] Oct 02 19:36:47 crc kubenswrapper[4909]: I1002 19:36:47.620264 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" path="/var/lib/kubelet/pods/ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53/volumes" Oct 02 19:38:09 crc kubenswrapper[4909]: I1002 19:38:09.553579 4909 generic.go:334] "Generic (PLEG): container finished" podID="e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" containerID="c1b860631f75d77c4a4593b3d52296e4335bdd29689edabb5a04eaadb20d37ce" 
exitCode=0 Oct 02 19:38:09 crc kubenswrapper[4909]: I1002 19:38:09.553676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" event={"ID":"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7","Type":"ContainerDied","Data":"c1b860631f75d77c4a4593b3d52296e4335bdd29689edabb5a04eaadb20d37ce"} Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.104968 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.134862 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.134956 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8txm\" (UniqueName: \"kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135145 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135204 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135281 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135318 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135400 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.135476 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory\") pod \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\" (UID: \"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7\") " Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.160688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). 
InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.169502 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm" (OuterVolumeSpecName: "kube-api-access-n8txm") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "kube-api-access-n8txm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.169719 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph" (OuterVolumeSpecName: "ceph") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.185958 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.189350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.198292 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.201149 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.213831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory" (OuterVolumeSpecName: "inventory") pod "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" (UID: "e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.244989 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245042 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8txm\" (UniqueName: \"kubernetes.io/projected/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-kube-api-access-n8txm\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245057 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245074 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245087 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245099 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245111 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.245123 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.598303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" event={"ID":"e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7","Type":"ContainerDied","Data":"943d946159f22f1369702e36b95cbe6c2e5ed489eb3e107fc630f7e90c04276c"} Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.598347 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943d946159f22f1369702e36b95cbe6c2e5ed489eb3e107fc630f7e90c04276c" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.598369 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.727444 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl"] Oct 02 19:38:11 crc kubenswrapper[4909]: E1002 19:38:11.728602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="registry-server" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.728707 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="registry-server" Oct 02 19:38:11 crc kubenswrapper[4909]: E1002 19:38:11.728803 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="extract-content" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.728878 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="extract-content" Oct 02 19:38:11 crc kubenswrapper[4909]: E1002 19:38:11.728978 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.729078 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:11 crc kubenswrapper[4909]: E1002 19:38:11.729165 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="extract-utilities" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.729238 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="extract-utilities" Oct 02 19:38:11 crc 
kubenswrapper[4909]: I1002 19:38:11.729562 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4749b7-b05c-4dd8-abbc-b6da3a1b7d53" containerName="registry-server" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.730634 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.731651 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.735250 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.735627 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.735734 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.736083 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7pxn7" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.736224 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.736470 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.739541 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl"] Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.857611 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.857671 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjh6\" (UniqueName: \"kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.857751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.857814 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.857901 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.858009 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.960300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.960373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.960426 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc 
kubenswrapper[4909]: I1002 19:38:11.960446 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjh6\" (UniqueName: \"kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.960489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.960527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.966200 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.966581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: 
\"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.966936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.967132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.967568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:11 crc kubenswrapper[4909]: I1002 19:38:11.985201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjh6\" (UniqueName: \"kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5cjpl\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:12 crc kubenswrapper[4909]: I1002 19:38:12.050071 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:12 crc kubenswrapper[4909]: I1002 19:38:12.636097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl"] Oct 02 19:38:13 crc kubenswrapper[4909]: I1002 19:38:13.623745 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" event={"ID":"524a4d41-71b9-4a01-8b0a-37c2748a79a2","Type":"ContainerStarted","Data":"68b9716100e521bb4f71f945eb07763a3dce999004af717aba3d05da0bc2747b"} Oct 02 19:38:14 crc kubenswrapper[4909]: I1002 19:38:14.634192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" event={"ID":"524a4d41-71b9-4a01-8b0a-37c2748a79a2","Type":"ContainerStarted","Data":"334bae5f54a4362328fae162f39b02bb2422b13bfe35ef4c55433762f8735535"} Oct 02 19:38:14 crc kubenswrapper[4909]: I1002 19:38:14.663915 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" podStartSLOduration=3.211425786 podStartE2EDuration="3.663888797s" podCreationTimestamp="2025-10-02 19:38:11 +0000 UTC" firstStartedPulling="2025-10-02 19:38:12.647448624 +0000 UTC m=+4813.834944493" lastFinishedPulling="2025-10-02 19:38:13.099911645 +0000 UTC m=+4814.287407504" observedRunningTime="2025-10-02 19:38:14.655570298 +0000 UTC m=+4815.843066167" watchObservedRunningTime="2025-10-02 19:38:14.663888797 +0000 UTC m=+4815.851384666" Oct 02 19:38:30 crc kubenswrapper[4909]: I1002 19:38:30.811788 4909 generic.go:334] "Generic (PLEG): container finished" podID="524a4d41-71b9-4a01-8b0a-37c2748a79a2" containerID="334bae5f54a4362328fae162f39b02bb2422b13bfe35ef4c55433762f8735535" exitCode=0 Oct 02 19:38:30 crc kubenswrapper[4909]: I1002 19:38:30.811937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" event={"ID":"524a4d41-71b9-4a01-8b0a-37c2748a79a2","Type":"ContainerDied","Data":"334bae5f54a4362328fae162f39b02bb2422b13bfe35ef4c55433762f8735535"} Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.304201 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418351 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" 
(UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.418666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgjh6\" (UniqueName: \"kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6\") pod \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\" (UID: \"524a4d41-71b9-4a01-8b0a-37c2748a79a2\") " Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.460204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph" (OuterVolumeSpecName: "ceph") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.467210 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6" (OuterVolumeSpecName: "kube-api-access-mgjh6") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "kube-api-access-mgjh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.524901 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.524934 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgjh6\" (UniqueName: \"kubernetes.io/projected/524a4d41-71b9-4a01-8b0a-37c2748a79a2-kube-api-access-mgjh6\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.531313 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory" (OuterVolumeSpecName: "inventory") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.543187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.581189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.587709 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "524a4d41-71b9-4a01-8b0a-37c2748a79a2" (UID: "524a4d41-71b9-4a01-8b0a-37c2748a79a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.627331 4909 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.627361 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.627373 4909 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.627382 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/524a4d41-71b9-4a01-8b0a-37c2748a79a2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.837328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" event={"ID":"524a4d41-71b9-4a01-8b0a-37c2748a79a2","Type":"ContainerDied","Data":"68b9716100e521bb4f71f945eb07763a3dce999004af717aba3d05da0bc2747b"} Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.837627 4909 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="68b9716100e521bb4f71f945eb07763a3dce999004af717aba3d05da0bc2747b" Oct 02 19:38:32 crc kubenswrapper[4909]: I1002 19:38:32.837578 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5cjpl" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.858967 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 19:38:48 crc kubenswrapper[4909]: E1002 19:38:48.860101 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524a4d41-71b9-4a01-8b0a-37c2748a79a2" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.860123 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="524a4d41-71b9-4a01-8b0a-37c2748a79a2" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.860443 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="524a4d41-71b9-4a01-8b0a-37c2748a79a2" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.861943 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.866316 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.866507 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.869274 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-lib-modules\") pod 
\"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.904910 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905087 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905132 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: 
\"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4bx\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-kube-api-access-wt4bx\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905205 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905414 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-run\") pod \"cinder-volume-volume1-0\" (UID: 
\"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.905633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.945331 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.948872 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.952163 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 02 19:38:48 crc kubenswrapper[4909]: I1002 19:38:48.968166 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-scripts\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013220 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013279 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013308 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-ceph\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-run\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013403 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013428 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-dev\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013533 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62qs\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-kube-api-access-l62qs\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013571 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-lib-cinder\") pod 
\"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013621 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-sys\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013747 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: 
I1002 19:38:49.013777 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013805 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013892 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.013992 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-run\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014022 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4bx\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-kube-api-access-wt4bx\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: 
\"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-lib-modules\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014638 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.014740 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-run\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.015195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.015351 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.016337 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.016650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.016724 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.016800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.016883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d392c07-e417-4d0d-b301-adf410105519-etc-nvme\") pod \"cinder-volume-volume1-0\" 
(UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.021565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.021708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.022293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.035361 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.036467 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d392c07-e417-4d0d-b301-adf410105519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.037016 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4bx\" (UniqueName: \"kubernetes.io/projected/2d392c07-e417-4d0d-b301-adf410105519-kube-api-access-wt4bx\") pod \"cinder-volume-volume1-0\" (UID: \"2d392c07-e417-4d0d-b301-adf410105519\") " pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115512 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115627 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-run\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-lib-modules\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115659 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-scripts\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115795 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115858 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-ceph\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.115982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-dev\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116065 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116150 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62qs\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-kube-api-access-l62qs\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116221 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116289 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-sys\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 
19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116333 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.116605 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-dev\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.118222 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-run\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.118372 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.118408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-lib-modules\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.123181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-machine-id\") pod 
\"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.123702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.123798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.124302 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.124358 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d132d6b-b137-48eb-8ccd-39354ec83056-sys\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.126484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-scripts\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.129338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.129442 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.133737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d132d6b-b137-48eb-8ccd-39354ec83056-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.137317 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-ceph\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.141861 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62qs\" (UniqueName: \"kubernetes.io/projected/6d132d6b-b137-48eb-8ccd-39354ec83056-kube-api-access-l62qs\") pod \"cinder-backup-0\" (UID: \"6d132d6b-b137-48eb-8ccd-39354ec83056\") " pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.190594 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.299398 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.754321 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-szqm7"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.755988 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-szqm7" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.773525 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-szqm7"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.784353 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.786377 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.788966 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.789244 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.789388 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.789509 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54748" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.810462 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.823085 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847131 4909 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847233 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847374 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf9k9\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847458 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847637 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847738 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847761 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.847839 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wn4\" (UniqueName: \"kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4\") pod \"manila-db-create-szqm7\" (UID: \"8700ec28-b969-413d-8c1b-40f72280eea2\") " pod="openstack/manila-db-create-szqm7" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.855314 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.856438 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.904579 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92n5\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952464 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " 
pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wn4\" (UniqueName: \"kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4\") pod \"manila-db-create-szqm7\" (UID: \"8700ec28-b969-413d-8c1b-40f72280eea2\") " pod="openstack/manila-db-create-szqm7" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952596 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc 
kubenswrapper[4909]: I1002 19:38:49.952628 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952661 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: 
I1002 19:38:49.952795 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf9k9\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952852 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952882 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952916 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.952935 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.960673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.963193 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.971078 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.971310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.971367 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:38:49 crc kubenswrapper[4909]: 
I1002 19:38:49.971935 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.972593 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.973166 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.978869 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.981853 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pjfph" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.982171 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.993485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.994107 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 02 19:38:49 crc kubenswrapper[4909]: 
I1002 19:38:49.995256 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:49 crc kubenswrapper[4909]: I1002 19:38:49.998409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf9k9\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.014807 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wn4\" (UniqueName: \"kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4\") pod \"manila-db-create-szqm7\" (UID: \"8700ec28-b969-413d-8c1b-40f72280eea2\") " pod="openstack/manila-db-create-szqm7" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.025006 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:50 crc kubenswrapper[4909]: E1002 19:38:50.038280 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="ed36fc2a-b8f0-4a50-b616-a34f438a6691" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.056758 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.062509 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt42d\" (UniqueName: 
\"kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.062604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.062722 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.062813 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.062971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063047 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063147 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063288 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs\") pod 
\"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063476 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92n5\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.063502 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.064663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.067705 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " 
pod="openstack/glance-default-external-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.067963 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.068815 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.072654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.073424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.080546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc 
kubenswrapper[4909]: I1002 19:38:50.081015 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-szqm7" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.082697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.083141 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.137336 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.137386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2d392c07-e417-4d0d-b301-adf410105519","Type":"ContainerStarted","Data":"50d4d8b37bc27e9b03b302b0314c40854e13efcf48ef2934fddd8991d52a48dc"} Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.185576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92n5\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.221071 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.221324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.221458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.221520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.221777 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt42d\" (UniqueName: \"kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.226570 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.238822 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.242040 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: E1002 19:38:50.244301 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="a953a5ba-5405-42c3-ac6c-1138d24c7d80" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.253691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.297095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt42d\" (UniqueName: \"kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d\") pod \"horizon-655568d68f-86d8j\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.373380 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key\") pod \"horizon-655568d68f-86d8j\" (UID: 
\"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.391599 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.401512 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.410936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.438748 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.455744 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480495 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480583 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480686 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf9k9\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480762 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.480842 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\" (UID: \"ed36fc2a-b8f0-4a50-b616-a34f438a6691\") " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.481008 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.481247 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.481320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.481348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgtq\" (UniqueName: \"kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.481409 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.486084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.498875 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.499085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs" (OuterVolumeSpecName: "logs") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.505933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data" (OuterVolumeSpecName: "config-data") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.514883 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph" (OuterVolumeSpecName: "ceph") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.517198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts" (OuterVolumeSpecName: "scripts") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.517225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.535327 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.545840 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.546398 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9" (OuterVolumeSpecName: "kube-api-access-sf9k9") pod "ed36fc2a-b8f0-4a50-b616-a34f438a6691" (UID: "ed36fc2a-b8f0-4a50-b616-a34f438a6691"). InnerVolumeSpecName "kube-api-access-sf9k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588640 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgtq\" (UniqueName: \"kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " 
pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588880 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588893 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588901 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed36fc2a-b8f0-4a50-b616-a34f438a6691-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588911 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588920 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf9k9\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-kube-api-access-sf9k9\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588931 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 
19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588939 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed36fc2a-b8f0-4a50-b616-a34f438a6691-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588960 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.588969 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36fc2a-b8f0-4a50-b616-a34f438a6691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.591213 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.594882 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.595205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.597585 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.609843 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgtq\" (UniqueName: \"kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq\") pod \"horizon-8dbbb67d5-fq6b9\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.611865 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.658297 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.691854 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.744651 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:38:50 crc kubenswrapper[4909]: I1002 19:38:50.924219 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-szqm7"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.116127 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.179320 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6d132d6b-b137-48eb-8ccd-39354ec83056","Type":"ContainerStarted","Data":"a70dbcdcf89db03bf30841f70de55edc4dea18ffca1c9c05e1a963bcba29a4c4"} Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.182804 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerStarted","Data":"325c57c96b3a2a09a1e5b98b681e3e743e20ce38bd1b72c6a0aff5e04636d978"} Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.185551 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.188486 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-szqm7" event={"ID":"8700ec28-b969-413d-8c1b-40f72280eea2","Type":"ContainerStarted","Data":"f205d51d68196cce3d23dc4b962b74aac8928c0eeca13c6b420c956a49a0f835"} Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.188570 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.335552 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.349633 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.407893 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.426518 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.436691 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.438782 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.442620 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.443059 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.450332 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512626 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc 
kubenswrapper[4909]: I1002 19:38:51.512744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512888 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v92n5\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.512980 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.513012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: 
\"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.513050 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts\") pod \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\" (UID: \"a953a5ba-5405-42c3-ac6c-1138d24c7d80\") " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.514139 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs" (OuterVolumeSpecName: "logs") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.514354 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.518408 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts" (OuterVolumeSpecName: "scripts") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.518500 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data" (OuterVolumeSpecName: "config-data") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.520266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.520366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.520409 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5" (OuterVolumeSpecName: "kube-api-access-v92n5") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "kube-api-access-v92n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.520767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.520842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph" (OuterVolumeSpecName: "ceph") pod "a953a5ba-5405-42c3-ac6c-1138d24c7d80" (UID: "a953a5ba-5405-42c3-ac6c-1138d24c7d80"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.615403 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616177 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgmn5\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616266 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616506 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616644 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616658 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616671 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616683 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616693 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a953a5ba-5405-42c3-ac6c-1138d24c7d80-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616705 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616726 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616738 4909 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-v92n5\" (UniqueName: \"kubernetes.io/projected/a953a5ba-5405-42c3-ac6c-1138d24c7d80-kube-api-access-v92n5\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.616749 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a953a5ba-5405-42c3-ac6c-1138d24c7d80-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.626568 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed36fc2a-b8f0-4a50-b616-a34f438a6691" path="/var/lib/kubelet/pods/ed36fc2a-b8f0-4a50-b616-a34f438a6691/volumes" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.649135 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.718760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719154 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgmn5\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719305 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " 
pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719395 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.719488 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.720766 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.722444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.722663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.725960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.726139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.727607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.730323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.731760 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.746470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgmn5\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5\") pod 
\"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:51 crc kubenswrapper[4909]: I1002 19:38:51.772887 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.064889 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.200923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerStarted","Data":"ee6087853981a49da837142c955cc8b1afd40de02799f2da3ccdd2315a9f96d8"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.202426 4909 generic.go:334] "Generic (PLEG): container finished" podID="8700ec28-b969-413d-8c1b-40f72280eea2" containerID="8d06acbf2ef8582f76d415d0a8d9b9e002819efa143f4d392d6744cfe968c910" exitCode=0 Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.202505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-szqm7" event={"ID":"8700ec28-b969-413d-8c1b-40f72280eea2","Type":"ContainerDied","Data":"8d06acbf2ef8582f76d415d0a8d9b9e002819efa143f4d392d6744cfe968c910"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.204449 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6d132d6b-b137-48eb-8ccd-39354ec83056","Type":"ContainerStarted","Data":"3ed14582d52a04a5a4bde65d4609c004de3c3363601e73f6bbe67e82ab5ba1d2"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.204472 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"6d132d6b-b137-48eb-8ccd-39354ec83056","Type":"ContainerStarted","Data":"9f0c7e3fc7f1600be9e7e973d4dc0f85974f9da8d91e7083bd9e0430b536f558"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.222677 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.224415 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2d392c07-e417-4d0d-b301-adf410105519","Type":"ContainerStarted","Data":"aa88468db71c9e387052ae884094d39d9171f8e06e4a0a38d324f370a26ee854"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.224455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2d392c07-e417-4d0d-b301-adf410105519","Type":"ContainerStarted","Data":"3943986165a999085ba457f510de80c09cb9b81fa581f2e11167ff07c1c68d75"} Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.262738 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.125836212 podStartE2EDuration="4.262719113s" podCreationTimestamp="2025-10-02 19:38:48 +0000 UTC" firstStartedPulling="2025-10-02 19:38:50.397253825 +0000 UTC m=+4851.584749684" lastFinishedPulling="2025-10-02 19:38:51.534136726 +0000 UTC m=+4852.721632585" observedRunningTime="2025-10-02 19:38:52.253614769 +0000 UTC m=+4853.441110648" watchObservedRunningTime="2025-10-02 19:38:52.262719113 +0000 UTC m=+4853.450214972" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.295349 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.114368164 podStartE2EDuration="4.295331289s" podCreationTimestamp="2025-10-02 19:38:48 +0000 UTC" firstStartedPulling="2025-10-02 19:38:49.973443107 +0000 UTC m=+4851.160938966" lastFinishedPulling="2025-10-02 19:38:51.154406242 
+0000 UTC m=+4852.341902091" observedRunningTime="2025-10-02 19:38:52.280471516 +0000 UTC m=+4853.467967375" watchObservedRunningTime="2025-10-02 19:38:52.295331289 +0000 UTC m=+4853.482827148" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.373131 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.425423 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.437096 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.438871 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.447209 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.458897 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.459554 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.558454 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.558501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.558519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.558887 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.559078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.559218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.559276 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.559308 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.559332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldz79\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.661812 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.661880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.661908 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.661931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.661952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldz79\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.662011 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.662051 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.662066 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.662129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.662440 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.666398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.667070 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.672670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.673332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.677580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.684560 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.689813 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.699624 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldz79\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: 
I1002 19:38:52.782733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.787420 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.833586 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.870827 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.872989 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.883240 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.912656 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.943137 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980185 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980256 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980308 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kgv\" (UniqueName: \"kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.980421 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:52 crc kubenswrapper[4909]: I1002 19:38:52.985288 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.040098 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c7d49444-5plj4"] Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.042361 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.050725 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c7d49444-5plj4"] Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.073362 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.073410 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085624 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: 
\"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kgv\" (UniqueName: \"kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085733 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-config-data\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085804 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" 
Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-secret-key\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085853 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-logs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-tls-certs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085881 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-scripts\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 
19:38:53.085977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lbq\" (UniqueName: \"kubernetes.io/projected/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-kube-api-access-85lbq\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.085996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-combined-ca-bundle\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.086013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.086934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.089712 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.089733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.095922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.099337 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.109513 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.128541 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.184997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kgv\" (UniqueName: \"kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv\") pod \"horizon-54b6f8f8f6-vtk9c\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187484 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-secret-key\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-logs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-tls-certs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-scripts\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lbq\" (UniqueName: \"kubernetes.io/projected/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-kube-api-access-85lbq\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187762 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-combined-ca-bundle\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.187844 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-config-data\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.192241 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-logs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.197444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-secret-key\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.200176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-config-data\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.202204 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-scripts\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " 
pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.204288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-horizon-tls-certs\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.234258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-combined-ca-bundle\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.247239 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.272728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerStarted","Data":"e7577fe2f15170cadb0f7376dab1bf95288c78f414b5b45f89df203faaa07936"} Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.288008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lbq\" (UniqueName: \"kubernetes.io/projected/a982efe8-dc2e-4706-bfd3-0b14dbd266cf-kube-api-access-85lbq\") pod \"horizon-8c7d49444-5plj4\" (UID: \"a982efe8-dc2e-4706-bfd3-0b14dbd266cf\") " pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.297813 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.412440 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:38:53 crc kubenswrapper[4909]: I1002 19:38:53.631091 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a953a5ba-5405-42c3-ac6c-1138d24c7d80" path="/var/lib/kubelet/pods/a953a5ba-5405-42c3-ac6c-1138d24c7d80/volumes" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.019351 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-szqm7" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.045558 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.055373 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:38:54 crc kubenswrapper[4909]: W1002 19:38:54.088365 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff545e92_3c9d_45b8_9438_89122eeef32f.slice/crio-0cd87e955892847bfb746dab46d293b8570bf6530b620197c63e8a4f2df0c6db WatchSource:0}: Error finding container 0cd87e955892847bfb746dab46d293b8570bf6530b620197c63e8a4f2df0c6db: Status 404 returned error can't find the container with id 0cd87e955892847bfb746dab46d293b8570bf6530b620197c63e8a4f2df0c6db Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.117416 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5wn4\" (UniqueName: \"kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4\") pod \"8700ec28-b969-413d-8c1b-40f72280eea2\" (UID: \"8700ec28-b969-413d-8c1b-40f72280eea2\") " Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.128252 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4" (OuterVolumeSpecName: 
"kube-api-access-f5wn4") pod "8700ec28-b969-413d-8c1b-40f72280eea2" (UID: "8700ec28-b969-413d-8c1b-40f72280eea2"). InnerVolumeSpecName "kube-api-access-f5wn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.191528 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.222650 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5wn4\" (UniqueName: \"kubernetes.io/projected/8700ec28-b969-413d-8c1b-40f72280eea2-kube-api-access-f5wn4\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.290192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerStarted","Data":"1645a9121102e225782cdc5cbf81b248aafab77be772f88e8c717a828a0c6a90"} Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.301480 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.308805 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-szqm7" event={"ID":"8700ec28-b969-413d-8c1b-40f72280eea2","Type":"ContainerDied","Data":"f205d51d68196cce3d23dc4b962b74aac8928c0eeca13c6b420c956a49a0f835"} Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.308836 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-szqm7" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.308840 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f205d51d68196cce3d23dc4b962b74aac8928c0eeca13c6b420c956a49a0f835" Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.313251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerStarted","Data":"0cd87e955892847bfb746dab46d293b8570bf6530b620197c63e8a4f2df0c6db"} Oct 02 19:38:54 crc kubenswrapper[4909]: I1002 19:38:54.593491 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c7d49444-5plj4"] Oct 02 19:38:55 crc kubenswrapper[4909]: I1002 19:38:55.323221 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c7d49444-5plj4" event={"ID":"a982efe8-dc2e-4706-bfd3-0b14dbd266cf","Type":"ContainerStarted","Data":"9f312929b8b64016612131de109c8ba1dc4e7cd574d842bfff79638c10ee8f0e"} Oct 02 19:38:55 crc kubenswrapper[4909]: I1002 19:38:55.330280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerStarted","Data":"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d"} Oct 02 19:38:55 crc kubenswrapper[4909]: I1002 19:38:55.332524 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerStarted","Data":"12b2a6d626030312619b017d92a1fb7006935d6689bbafa79bf28025a9d549f4"} Oct 02 19:38:56 crc kubenswrapper[4909]: I1002 19:38:56.354217 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerStarted","Data":"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b"} Oct 02 19:38:56 crc kubenswrapper[4909]: I1002 19:38:56.354695 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-log" containerID="cri-o://87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" gracePeriod=30 Oct 02 19:38:56 crc kubenswrapper[4909]: I1002 19:38:56.355140 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-httpd" containerID="cri-o://61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" gracePeriod=30 Oct 02 19:38:56 crc kubenswrapper[4909]: I1002 19:38:56.384604 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.384582073 podStartE2EDuration="5.384582073s" podCreationTimestamp="2025-10-02 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:38:56.376421478 +0000 UTC m=+4857.563917337" watchObservedRunningTime="2025-10-02 19:38:56.384582073 +0000 UTC m=+4857.572077932" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.284769 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.374414 4909 generic.go:334] "Generic (PLEG): container finished" podID="bd44a40e-51c2-4592-973d-654faf5f6849" containerID="61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" exitCode=143 Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.374448 4909 generic.go:334] "Generic (PLEG): container finished" podID="bd44a40e-51c2-4592-973d-654faf5f6849" containerID="87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" exitCode=143 Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.374572 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.375393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerDied","Data":"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b"} Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.375419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerDied","Data":"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d"} Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.375429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd44a40e-51c2-4592-973d-654faf5f6849","Type":"ContainerDied","Data":"e7577fe2f15170cadb0f7376dab1bf95288c78f414b5b45f89df203faaa07936"} Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.375452 4909 scope.go:117] "RemoveContainer" containerID="61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.381996 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerStarted","Data":"78d39831c341f3b7aedc0f1ce087d250f44616af6fe1e5727d1b3d3de647161e"} Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.382222 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-log" containerID="cri-o://12b2a6d626030312619b017d92a1fb7006935d6689bbafa79bf28025a9d549f4" gracePeriod=30 Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.382476 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-httpd" containerID="cri-o://78d39831c341f3b7aedc0f1ce087d250f44616af6fe1e5727d1b3d3de647161e" gracePeriod=30 Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406285 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406517 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406586 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgmn5\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406608 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: \"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.406696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs\") pod \"bd44a40e-51c2-4592-973d-654faf5f6849\" (UID: 
\"bd44a40e-51c2-4592-973d-654faf5f6849\") " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.408323 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.408636 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs" (OuterVolumeSpecName: "logs") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.415567 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.415550664 podStartE2EDuration="5.415550664s" podCreationTimestamp="2025-10-02 19:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:38:57.40001562 +0000 UTC m=+4858.587511479" watchObservedRunningTime="2025-10-02 19:38:57.415550664 +0000 UTC m=+4858.603046523" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.415975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts" (OuterVolumeSpecName: "scripts") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.419046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.419107 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5" (OuterVolumeSpecName: "kube-api-access-lgmn5") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "kube-api-access-lgmn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.424883 4909 scope.go:117] "RemoveContainer" containerID="87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.437254 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph" (OuterVolumeSpecName: "ceph") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.462718 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510499 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510529 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510538 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a40e-51c2-4592-973d-654faf5f6849-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510546 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510555 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgmn5\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-kube-api-access-lgmn5\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510564 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd44a40e-51c2-4592-973d-654faf5f6849-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.510590 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.522085 4909 scope.go:117] "RemoveContainer" 
containerID="61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" Oct 02 19:38:57 crc kubenswrapper[4909]: E1002 19:38:57.522629 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b\": container with ID starting with 61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b not found: ID does not exist" containerID="61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.522680 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b"} err="failed to get container status \"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b\": rpc error: code = NotFound desc = could not find container \"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b\": container with ID starting with 61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b not found: ID does not exist" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.522710 4909 scope.go:117] "RemoveContainer" containerID="87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" Oct 02 19:38:57 crc kubenswrapper[4909]: E1002 19:38:57.524175 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d\": container with ID starting with 87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d not found: ID does not exist" containerID="87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.524207 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d"} err="failed to get container status \"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d\": rpc error: code = NotFound desc = could not find container \"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d\": container with ID starting with 87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d not found: ID does not exist" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.524232 4909 scope.go:117] "RemoveContainer" containerID="61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.525273 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b"} err="failed to get container status \"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b\": rpc error: code = NotFound desc = could not find container \"61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b\": container with ID starting with 61a5af19737c0066eabade3692879c73c9f1be57acd9fa29d72e87343f899a3b not found: ID does not exist" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.525303 4909 scope.go:117] "RemoveContainer" containerID="87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.525801 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d"} err="failed to get container status \"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d\": rpc error: code = NotFound desc = could not find container \"87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d\": container with ID starting with 87a1e14728d9364d350465a2b74b2a6484645361ecd76325f754d4bb46f9dd9d not found: ID does not 
exist" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.547905 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.584441 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.584915 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data" (OuterVolumeSpecName: "config-data") pod "bd44a40e-51c2-4592-973d-654faf5f6849" (UID: "bd44a40e-51c2-4592-973d-654faf5f6849"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.611882 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.611912 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd44a40e-51c2-4592-973d-654faf5f6849-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.611925 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.763514 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.773071 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783199 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:57 crc kubenswrapper[4909]: E1002 19:38:57.783661 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-log" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783675 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-log" Oct 02 19:38:57 crc kubenswrapper[4909]: E1002 19:38:57.783684 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-httpd" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783690 4909 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-httpd" Oct 02 19:38:57 crc kubenswrapper[4909]: E1002 19:38:57.783736 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8700ec28-b969-413d-8c1b-40f72280eea2" containerName="mariadb-database-create" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783743 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8700ec28-b969-413d-8c1b-40f72280eea2" containerName="mariadb-database-create" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783940 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8700ec28-b969-413d-8c1b-40f72280eea2" containerName="mariadb-database-create" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783960 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-httpd" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.783978 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" containerName="glance-log" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.785250 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.788416 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.788419 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.793524 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-ceph\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h58rd\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-kube-api-access-h58rd\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921862 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-config-data\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-scripts\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.921967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-logs\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.922013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:57 crc kubenswrapper[4909]: I1002 19:38:57.922071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h58rd\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-kube-api-access-h58rd\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-config-data\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-scripts\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-logs\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024511 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.024582 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-ceph\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.025833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-logs\") pod \"glance-default-external-api-0\" (UID: 
\"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.025920 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.026190 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d359125d-3d0d-4c5d-bacd-c722f9fe3116-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.031292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.032927 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-scripts\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.033062 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-ceph\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 
crc kubenswrapper[4909]: I1002 19:38:58.033680 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.034266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d359125d-3d0d-4c5d-bacd-c722f9fe3116-config-data\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.043880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h58rd\" (UniqueName: \"kubernetes.io/projected/d359125d-3d0d-4c5d-bacd-c722f9fe3116-kube-api-access-h58rd\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.081325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d359125d-3d0d-4c5d-bacd-c722f9fe3116\") " pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.118528 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.401521 4909 generic.go:334] "Generic (PLEG): container finished" podID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerID="78d39831c341f3b7aedc0f1ce087d250f44616af6fe1e5727d1b3d3de647161e" exitCode=0 Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.401925 4909 generic.go:334] "Generic (PLEG): container finished" podID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerID="12b2a6d626030312619b017d92a1fb7006935d6689bbafa79bf28025a9d549f4" exitCode=143 Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.401683 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerDied","Data":"78d39831c341f3b7aedc0f1ce087d250f44616af6fe1e5727d1b3d3de647161e"} Oct 02 19:38:58 crc kubenswrapper[4909]: I1002 19:38:58.401964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerDied","Data":"12b2a6d626030312619b017d92a1fb7006935d6689bbafa79bf28025a9d549f4"} Oct 02 19:38:59 crc kubenswrapper[4909]: I1002 19:38:59.431691 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 02 19:38:59 crc kubenswrapper[4909]: I1002 19:38:59.569858 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 02 19:38:59 crc kubenswrapper[4909]: I1002 19:38:59.669452 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd44a40e-51c2-4592-973d-654faf5f6849" path="/var/lib/kubelet/pods/bd44a40e-51c2-4592-973d-654faf5f6849/volumes" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.067310 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154174 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154396 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154790 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154926 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154955 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldz79\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.154991 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs\") pod \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\" (UID: \"e78f386a-5f63-4ec7-b23b-a258a0b80bcb\") " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.156846 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs" (OuterVolumeSpecName: "logs") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.159682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). 
InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.160537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph" (OuterVolumeSpecName: "ceph") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.162320 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.162892 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts" (OuterVolumeSpecName: "scripts") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.165204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79" (OuterVolumeSpecName: "kube-api-access-ldz79") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "kube-api-access-ldz79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.204475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.239652 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data" (OuterVolumeSpecName: "config-data") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.246295 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e78f386a-5f63-4ec7-b23b-a258a0b80bcb" (UID: "e78f386a-5f63-4ec7-b23b-a258a0b80bcb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264101 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264141 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264151 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264160 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldz79\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-kube-api-access-ldz79\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264170 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264205 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264215 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264223 4909 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.264231 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e78f386a-5f63-4ec7-b23b-a258a0b80bcb-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.285624 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.370360 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.467451 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerStarted","Data":"158a6747681309b253213abe46b326875ad719b4905a54dda0b0426985f87bde"} Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.469050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e78f386a-5f63-4ec7-b23b-a258a0b80bcb","Type":"ContainerDied","Data":"1645a9121102e225782cdc5cbf81b248aafab77be772f88e8c717a828a0c6a90"} Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.469098 4909 scope.go:117] "RemoveContainer" containerID="78d39831c341f3b7aedc0f1ce087d250f44616af6fe1e5727d1b3d3de647161e" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.469262 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.507868 4909 scope.go:117] "RemoveContainer" containerID="12b2a6d626030312619b017d92a1fb7006935d6689bbafa79bf28025a9d549f4" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.523970 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.544752 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.557139 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:39:03 crc kubenswrapper[4909]: E1002 19:39:03.558201 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-httpd" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.558281 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-httpd" Oct 02 19:39:03 crc kubenswrapper[4909]: E1002 19:39:03.558372 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-log" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.558392 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-log" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.558967 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-httpd" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.559097 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" containerName="glance-log" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.571316 4909 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.583327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.583503 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.590855 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.633240 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78f386a-5f63-4ec7-b23b-a258a0b80bcb" path="/var/lib/kubelet/pods/e78f386a-5f63-4ec7-b23b-a258a0b80bcb/volumes" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.635061 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.789580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.789913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790131 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790178 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dmk\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-kube-api-access-z8dmk\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790671 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.790727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.893821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894611 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dmk\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-kube-api-access-z8dmk\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894830 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.894903 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.895183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.896177 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.897418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d71e63f4-8904-4721-8c48-b66216330fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.899492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.900338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.906647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.910910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.917323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dmk\" (UniqueName: \"kubernetes.io/projected/d71e63f4-8904-4721-8c48-b66216330fc2-kube-api-access-z8dmk\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.925982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71e63f4-8904-4721-8c48-b66216330fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " pod="openstack/glance-default-internal-api-0" Oct 02 19:39:03 crc kubenswrapper[4909]: I1002 19:39:03.936891 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d71e63f4-8904-4721-8c48-b66216330fc2\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.211706 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.482106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c7d49444-5plj4" event={"ID":"a982efe8-dc2e-4706-bfd3-0b14dbd266cf","Type":"ContainerStarted","Data":"dce0ba2ce7b612fca1fa781fb607e985b2f6122993ae37214dd2cedd85cba931"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.482317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c7d49444-5plj4" event={"ID":"a982efe8-dc2e-4706-bfd3-0b14dbd266cf","Type":"ContainerStarted","Data":"ee5c07143e0d6e19566c1d503d5806451fea830202aa96c9f273da97ec83ce9a"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.487755 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerStarted","Data":"3f689b4a98f2016c95f77a2c10973a88a6929b5a0c70e2f5cbbcb04031ca40e0"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.487787 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerStarted","Data":"b725d895109d85ab1a9fcc24f657dbff88ac8788cbe897cf2fee1d3db5f693b5"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.487881 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-655568d68f-86d8j" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon-log" containerID="cri-o://b725d895109d85ab1a9fcc24f657dbff88ac8788cbe897cf2fee1d3db5f693b5" gracePeriod=30 Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.487959 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-655568d68f-86d8j" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon" containerID="cri-o://3f689b4a98f2016c95f77a2c10973a88a6929b5a0c70e2f5cbbcb04031ca40e0" gracePeriod=30 Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.504257 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerStarted","Data":"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.504292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerStarted","Data":"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.507457 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8c7d49444-5plj4" podStartSLOduration=4.087024528 podStartE2EDuration="12.507435125s" podCreationTimestamp="2025-10-02 19:38:52 +0000 UTC" firstStartedPulling="2025-10-02 19:38:54.624638513 +0000 UTC m=+4855.812134372" lastFinishedPulling="2025-10-02 19:39:03.04504911 +0000 UTC m=+4864.232544969" observedRunningTime="2025-10-02 19:39:04.501307844 +0000 UTC m=+4865.688803693" watchObservedRunningTime="2025-10-02 19:39:04.507435125 +0000 UTC m=+4865.694930994" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.509504 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d359125d-3d0d-4c5d-bacd-c722f9fe3116","Type":"ContainerStarted","Data":"6d3b446c97e37796ce0d662fcc459675ccfd18ade410e610b524a5520af611b9"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.509556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d359125d-3d0d-4c5d-bacd-c722f9fe3116","Type":"ContainerStarted","Data":"bebf2feed4060aaeaa9ea1926a10e33f8d60bf22b2728882a5cb34bab97fe0f3"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.518418 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-655568d68f-86d8j" podStartSLOduration=3.677761529 podStartE2EDuration="15.518399507s" podCreationTimestamp="2025-10-02 19:38:49 +0000 UTC" firstStartedPulling="2025-10-02 19:38:51.151802941 +0000 UTC m=+4852.339298790" lastFinishedPulling="2025-10-02 19:39:02.992440899 +0000 UTC m=+4864.179936768" observedRunningTime="2025-10-02 19:39:04.517947583 +0000 UTC m=+4865.705443442" watchObservedRunningTime="2025-10-02 19:39:04.518399507 +0000 UTC m=+4865.705895366" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.528879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerStarted","Data":"f87a5a0bd3fdfa394f350e3d3e6ab9e199fe4a4a70f991d3626cb12b505a2f8d"} Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.529049 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8dbbb67d5-fq6b9" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon-log" containerID="cri-o://158a6747681309b253213abe46b326875ad719b4905a54dda0b0426985f87bde" gracePeriod=30 Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.529294 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8dbbb67d5-fq6b9" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon" containerID="cri-o://f87a5a0bd3fdfa394f350e3d3e6ab9e199fe4a4a70f991d3626cb12b505a2f8d" gracePeriod=30 Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.543152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54b6f8f8f6-vtk9c" 
podStartSLOduration=3.6751650529999997 podStartE2EDuration="12.543137248s" podCreationTimestamp="2025-10-02 19:38:52 +0000 UTC" firstStartedPulling="2025-10-02 19:38:54.097690151 +0000 UTC m=+4855.285186010" lastFinishedPulling="2025-10-02 19:39:02.965662346 +0000 UTC m=+4864.153158205" observedRunningTime="2025-10-02 19:39:04.533967782 +0000 UTC m=+4865.721463641" watchObservedRunningTime="2025-10-02 19:39:04.543137248 +0000 UTC m=+4865.730633107" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.558907 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8dbbb67d5-fq6b9" podStartSLOduration=3.134272865 podStartE2EDuration="14.558884069s" podCreationTimestamp="2025-10-02 19:38:50 +0000 UTC" firstStartedPulling="2025-10-02 19:38:51.496111731 +0000 UTC m=+4852.683607600" lastFinishedPulling="2025-10-02 19:39:02.920722945 +0000 UTC m=+4864.108218804" observedRunningTime="2025-10-02 19:39:04.553971106 +0000 UTC m=+4865.741466965" watchObservedRunningTime="2025-10-02 19:39:04.558884069 +0000 UTC m=+4865.746379928" Oct 02 19:39:04 crc kubenswrapper[4909]: I1002 19:39:04.787282 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 19:39:04 crc kubenswrapper[4909]: W1002 19:39:04.800384 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71e63f4_8904_4721_8c48_b66216330fc2.slice/crio-3765c3427406fb67ab4bc1c75e2bc2d93055550a1d5ea4af8afb4604aa81b683 WatchSource:0}: Error finding container 3765c3427406fb67ab4bc1c75e2bc2d93055550a1d5ea4af8afb4604aa81b683: Status 404 returned error can't find the container with id 3765c3427406fb67ab4bc1c75e2bc2d93055550a1d5ea4af8afb4604aa81b683 Oct 02 19:39:05 crc kubenswrapper[4909]: I1002 19:39:05.547129 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d359125d-3d0d-4c5d-bacd-c722f9fe3116","Type":"ContainerStarted","Data":"76337cab8765d22fc5b4f5b69e3f739f8d82aa1d10217138101b6cd5d7507755"} Oct 02 19:39:05 crc kubenswrapper[4909]: I1002 19:39:05.558057 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d71e63f4-8904-4721-8c48-b66216330fc2","Type":"ContainerStarted","Data":"5b5d00cc0cb9ab39002fa373241161d914701e0f324251e439537b8526830d38"} Oct 02 19:39:05 crc kubenswrapper[4909]: I1002 19:39:05.558112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d71e63f4-8904-4721-8c48-b66216330fc2","Type":"ContainerStarted","Data":"3765c3427406fb67ab4bc1c75e2bc2d93055550a1d5ea4af8afb4604aa81b683"} Oct 02 19:39:05 crc kubenswrapper[4909]: I1002 19:39:05.572349 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.572310693 podStartE2EDuration="8.572310693s" podCreationTimestamp="2025-10-02 19:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:39:05.563732886 +0000 UTC m=+4866.751228745" watchObservedRunningTime="2025-10-02 19:39:05.572310693 +0000 UTC m=+4866.759806552" Oct 02 19:39:06 crc kubenswrapper[4909]: I1002 19:39:06.587955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d71e63f4-8904-4721-8c48-b66216330fc2","Type":"ContainerStarted","Data":"31c95b34fb2fac70f5ba112f73b1365f45af85bdcfb95a19eed86844a1f1bebd"} Oct 02 19:39:06 crc kubenswrapper[4909]: I1002 19:39:06.624165 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.624143374 podStartE2EDuration="3.624143374s" podCreationTimestamp="2025-10-02 19:39:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:39:06.623574946 +0000 UTC m=+4867.811070815" watchObservedRunningTime="2025-10-02 19:39:06.624143374 +0000 UTC m=+4867.811639243" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.119526 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.121305 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.184897 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.203892 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.608103 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 19:39:08 crc kubenswrapper[4909]: I1002 19:39:08.608155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 19:39:09 crc kubenswrapper[4909]: I1002 19:39:09.808200 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-bdaa-account-create-klvbh"] Oct 02 19:39:09 crc kubenswrapper[4909]: I1002 19:39:09.811126 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:09 crc kubenswrapper[4909]: I1002 19:39:09.815948 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 02 19:39:09 crc kubenswrapper[4909]: I1002 19:39:09.822174 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-bdaa-account-create-klvbh"] Oct 02 19:39:09 crc kubenswrapper[4909]: I1002 19:39:09.936799 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27ql\" (UniqueName: \"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql\") pod \"manila-bdaa-account-create-klvbh\" (UID: \"525e15bf-102a-4a3d-b7e1-4184a27bec88\") " pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.039599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27ql\" (UniqueName: \"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql\") pod \"manila-bdaa-account-create-klvbh\" (UID: \"525e15bf-102a-4a3d-b7e1-4184a27bec88\") " pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.070275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27ql\" (UniqueName: \"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql\") pod \"manila-bdaa-account-create-klvbh\" (UID: \"525e15bf-102a-4a3d-b7e1-4184a27bec88\") " pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.142163 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.612877 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.716546 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-bdaa-account-create-klvbh"] Oct 02 19:39:10 crc kubenswrapper[4909]: W1002 19:39:10.717269 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod525e15bf_102a_4a3d_b7e1_4184a27bec88.slice/crio-c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4 WatchSource:0}: Error finding container c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4: Status 404 returned error can't find the container with id c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4 Oct 02 19:39:10 crc kubenswrapper[4909]: I1002 19:39:10.745481 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:39:11 crc kubenswrapper[4909]: I1002 19:39:11.663343 4909 generic.go:334] "Generic (PLEG): container finished" podID="525e15bf-102a-4a3d-b7e1-4184a27bec88" containerID="1ed545246c69b8ef10256c2b695a9de8a7249188ee8027f591a4855311515190" exitCode=0 Oct 02 19:39:11 crc kubenswrapper[4909]: I1002 19:39:11.663667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bdaa-account-create-klvbh" event={"ID":"525e15bf-102a-4a3d-b7e1-4184a27bec88","Type":"ContainerDied","Data":"1ed545246c69b8ef10256c2b695a9de8a7249188ee8027f591a4855311515190"} Oct 02 19:39:11 crc kubenswrapper[4909]: I1002 19:39:11.663696 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bdaa-account-create-klvbh" 
event={"ID":"525e15bf-102a-4a3d-b7e1-4184a27bec88","Type":"ContainerStarted","Data":"c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4"} Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.029600 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.032930 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.201365 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.298777 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.300268 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.311069 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54b6f8f8f6-vtk9c" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.73:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.73:8443: connect: connection refused" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.328130 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27ql\" (UniqueName: \"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql\") pod \"525e15bf-102a-4a3d-b7e1-4184a27bec88\" (UID: \"525e15bf-102a-4a3d-b7e1-4184a27bec88\") " Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.335221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql" (OuterVolumeSpecName: "kube-api-access-t27ql") pod "525e15bf-102a-4a3d-b7e1-4184a27bec88" (UID: "525e15bf-102a-4a3d-b7e1-4184a27bec88"). InnerVolumeSpecName "kube-api-access-t27ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.413649 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.413698 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.415007 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c7d49444-5plj4" podUID="a982efe8-dc2e-4706-bfd3-0b14dbd266cf" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.74:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.74:8443: connect: connection refused" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.430708 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27ql\" (UniqueName: \"kubernetes.io/projected/525e15bf-102a-4a3d-b7e1-4184a27bec88-kube-api-access-t27ql\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.683104 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-bdaa-account-create-klvbh" Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.683156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bdaa-account-create-klvbh" event={"ID":"525e15bf-102a-4a3d-b7e1-4184a27bec88","Type":"ContainerDied","Data":"c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4"} Oct 02 19:39:13 crc kubenswrapper[4909]: I1002 19:39:13.683183 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c84a347756906c6b48dce31f189f3df96f5cfad3b952628e5ac6d2b6e79d75b4" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.212537 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.212707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.313816 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.345710 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.700428 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:14 crc kubenswrapper[4909]: I1002 19:39:14.702216 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.158348 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-nfhmm"] Oct 02 19:39:15 crc kubenswrapper[4909]: E1002 19:39:15.159352 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="525e15bf-102a-4a3d-b7e1-4184a27bec88" containerName="mariadb-account-create" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.159471 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="525e15bf-102a-4a3d-b7e1-4184a27bec88" containerName="mariadb-account-create" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.159931 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="525e15bf-102a-4a3d-b7e1-4184a27bec88" containerName="mariadb-account-create" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.161051 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.163237 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-27hjh" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.165430 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.169915 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-nfhmm"] Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.308263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.308311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 
19:39:15.308822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.308918 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22zq\" (UniqueName: \"kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.410414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.410461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.410620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.410653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q22zq\" (UniqueName: \"kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.416748 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.418081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.418582 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.436555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22zq\" (UniqueName: \"kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq\") pod \"manila-db-sync-nfhmm\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:15 crc kubenswrapper[4909]: I1002 19:39:15.496740 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:16 crc kubenswrapper[4909]: I1002 19:39:16.123772 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-nfhmm"] Oct 02 19:39:16 crc kubenswrapper[4909]: W1002 19:39:16.308264 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f857eeb_83ae_4626_ae42_a4d52389bf79.slice/crio-c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143 WatchSource:0}: Error finding container c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143: Status 404 returned error can't find the container with id c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143 Oct 02 19:39:16 crc kubenswrapper[4909]: I1002 19:39:16.719959 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 19:39:16 crc kubenswrapper[4909]: I1002 19:39:16.719991 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 19:39:16 crc kubenswrapper[4909]: I1002 19:39:16.719949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-nfhmm" event={"ID":"3f857eeb-83ae-4626-ae42-a4d52389bf79","Type":"ContainerStarted","Data":"c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143"} Oct 02 19:39:17 crc kubenswrapper[4909]: I1002 19:39:17.292740 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:17 crc kubenswrapper[4909]: I1002 19:39:17.730051 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 19:39:17 crc kubenswrapper[4909]: I1002 19:39:17.795607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 19:39:22 crc kubenswrapper[4909]: I1002 19:39:22.853378 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-sync-nfhmm" event={"ID":"3f857eeb-83ae-4626-ae42-a4d52389bf79","Type":"ContainerStarted","Data":"d5f9b9dc3717da5f5e77da952a5f53011afcc50d33014b84cdd4e0980ed90f22"} Oct 02 19:39:22 crc kubenswrapper[4909]: I1002 19:39:22.883227 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-nfhmm" podStartSLOduration=2.231197338 podStartE2EDuration="7.883204325s" podCreationTimestamp="2025-10-02 19:39:15 +0000 UTC" firstStartedPulling="2025-10-02 19:39:16.316273854 +0000 UTC m=+4877.503769713" lastFinishedPulling="2025-10-02 19:39:21.968280831 +0000 UTC m=+4883.155776700" observedRunningTime="2025-10-02 19:39:22.87020563 +0000 UTC m=+4884.057701489" watchObservedRunningTime="2025-10-02 19:39:22.883204325 +0000 UTC m=+4884.070700184" Oct 02 19:39:23 crc kubenswrapper[4909]: I1002 19:39:23.054705 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:39:23 crc kubenswrapper[4909]: I1002 19:39:23.054763 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:39:23 crc kubenswrapper[4909]: I1002 19:39:23.299532 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54b6f8f8f6-vtk9c" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.73:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.73:8443: connect: connection refused" Oct 02 19:39:25 crc kubenswrapper[4909]: I1002 
19:39:25.767676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.439699 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8c7d49444-5plj4" Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.512578 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.512818 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54b6f8f8f6-vtk9c" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon-log" containerID="cri-o://9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca" gracePeriod=30 Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.512940 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54b6f8f8f6-vtk9c" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" containerID="cri-o://a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad" gracePeriod=30 Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.908120 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerID="a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad" exitCode=0 Oct 02 19:39:27 crc kubenswrapper[4909]: I1002 19:39:27.908203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerDied","Data":"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad"} Oct 02 19:39:31 crc kubenswrapper[4909]: I1002 19:39:31.963194 4909 generic.go:334] "Generic (PLEG): container finished" podID="3f857eeb-83ae-4626-ae42-a4d52389bf79" containerID="d5f9b9dc3717da5f5e77da952a5f53011afcc50d33014b84cdd4e0980ed90f22" exitCode=0 Oct 02 
19:39:31 crc kubenswrapper[4909]: I1002 19:39:31.963282 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-nfhmm" event={"ID":"3f857eeb-83ae-4626-ae42-a4d52389bf79","Type":"ContainerDied","Data":"d5f9b9dc3717da5f5e77da952a5f53011afcc50d33014b84cdd4e0980ed90f22"} Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.452703 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.558765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data\") pod \"3f857eeb-83ae-4626-ae42-a4d52389bf79\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.559181 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22zq\" (UniqueName: \"kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq\") pod \"3f857eeb-83ae-4626-ae42-a4d52389bf79\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.559358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle\") pod \"3f857eeb-83ae-4626-ae42-a4d52389bf79\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.559460 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data\") pod \"3f857eeb-83ae-4626-ae42-a4d52389bf79\" (UID: \"3f857eeb-83ae-4626-ae42-a4d52389bf79\") " Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.565172 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "3f857eeb-83ae-4626-ae42-a4d52389bf79" (UID: "3f857eeb-83ae-4626-ae42-a4d52389bf79"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.567671 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq" (OuterVolumeSpecName: "kube-api-access-q22zq") pod "3f857eeb-83ae-4626-ae42-a4d52389bf79" (UID: "3f857eeb-83ae-4626-ae42-a4d52389bf79"). InnerVolumeSpecName "kube-api-access-q22zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.584549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data" (OuterVolumeSpecName: "config-data") pod "3f857eeb-83ae-4626-ae42-a4d52389bf79" (UID: "3f857eeb-83ae-4626-ae42-a4d52389bf79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.590853 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f857eeb-83ae-4626-ae42-a4d52389bf79" (UID: "3f857eeb-83ae-4626-ae42-a4d52389bf79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.662620 4909 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.662653 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22zq\" (UniqueName: \"kubernetes.io/projected/3f857eeb-83ae-4626-ae42-a4d52389bf79-kube-api-access-q22zq\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.662662 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.662670 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f857eeb-83ae-4626-ae42-a4d52389bf79-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.993248 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-nfhmm" event={"ID":"3f857eeb-83ae-4626-ae42-a4d52389bf79","Type":"ContainerDied","Data":"c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143"} Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.993313 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-nfhmm" Oct 02 19:39:33 crc kubenswrapper[4909]: I1002 19:39:33.993341 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34bc1c95bfef77b571a1747178f09b47d50bf51a95cdc309868aff98d7d2143" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.271846 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: E1002 19:39:34.272835 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f857eeb-83ae-4626-ae42-a4d52389bf79" containerName="manila-db-sync" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.272858 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f857eeb-83ae-4626-ae42-a4d52389bf79" containerName="manila-db-sync" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.273178 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f857eeb-83ae-4626-ae42-a4d52389bf79" containerName="manila-db-sync" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.274366 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.278858 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-27hjh" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.279063 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.279227 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.288212 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.289937 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.300314 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.303799 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.304116 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.345082 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380626 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380708 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380748 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380798 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380818 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.380881 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381058 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2qz\" (UniqueName: \"kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381178 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381331 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.381385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggh8\" (UniqueName: 
\"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.446326 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cwz7n"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.453980 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.483954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/ac29833f-f7cb-4d06-96f8-3f73e527b175-kube-api-access-g4vlh\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484114 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2qz\" (UniqueName: \"kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz\") pod 
\"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-config\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484175 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.484263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 
19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485180 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cwz7n"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485426 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485466 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggh8\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485568 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-sb\") pod 
\"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485822 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485854 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.485962 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 
19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.486194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.486275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.486563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.492784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.492848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.492937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.495279 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.500664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.503929 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.506055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.507715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2qz\" (UniqueName: \"kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz\") pod \"manila-scheduler-0\" (UID: 
\"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.511691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.512060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts\") pod \"manila-scheduler-0\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.514043 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggh8\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.518368 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.527396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts\") pod \"manila-share-share1-0\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602359 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-sb\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602514 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602619 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/ac29833f-f7cb-4d06-96f8-3f73e527b175-kube-api-access-g4vlh\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.602638 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-config\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.605564 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.606335 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.608201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-sb\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.611233 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.614680 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.621614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.624336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.624593 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.624781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.625230 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac29833f-f7cb-4d06-96f8-3f73e527b175-config\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.626970 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.631833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/ac29833f-f7cb-4d06-96f8-3f73e527b175-kube-api-access-g4vlh\") pod \"dnsmasq-dns-74cfff99f-cwz7n\" (UID: \"ac29833f-f7cb-4d06-96f8-3f73e527b175\") " pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.637904 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721629 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57tzd\" (UniqueName: \"kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721757 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.721978 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823420 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823550 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57tzd\" (UniqueName: \"kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.823577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.827610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 
19:39:34.828128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.835356 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.836142 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.836560 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.836742 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " pod="openstack/manila-api-0" Oct 02 19:39:34 crc kubenswrapper[4909]: I1002 19:39:34.852981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57tzd\" (UniqueName: \"kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd\") pod \"manila-api-0\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " 
pod="openstack/manila-api-0" Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.056410 4909 generic.go:334] "Generic (PLEG): container finished" podID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerID="3f689b4a98f2016c95f77a2c10973a88a6929b5a0c70e2f5cbbcb04031ca40e0" exitCode=137 Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.056719 4909 generic.go:334] "Generic (PLEG): container finished" podID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerID="b725d895109d85ab1a9fcc24f657dbff88ac8788cbe897cf2fee1d3db5f693b5" exitCode=137 Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.056805 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerDied","Data":"3f689b4a98f2016c95f77a2c10973a88a6929b5a0c70e2f5cbbcb04031ca40e0"} Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.056831 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerDied","Data":"b725d895109d85ab1a9fcc24f657dbff88ac8788cbe897cf2fee1d3db5f693b5"} Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.076834 4909 generic.go:334] "Generic (PLEG): container finished" podID="00ac5257-bc64-4a28-92a8-13b68c074824" containerID="f87a5a0bd3fdfa394f350e3d3e6ab9e199fe4a4a70f991d3626cb12b505a2f8d" exitCode=137 Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.076864 4909 generic.go:334] "Generic (PLEG): container finished" podID="00ac5257-bc64-4a28-92a8-13b68c074824" containerID="158a6747681309b253213abe46b326875ad719b4905a54dda0b0426985f87bde" exitCode=137 Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.076886 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerDied","Data":"f87a5a0bd3fdfa394f350e3d3e6ab9e199fe4a4a70f991d3626cb12b505a2f8d"} Oct 02 19:39:35 
crc kubenswrapper[4909]: I1002 19:39:35.076915 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerDied","Data":"158a6747681309b253213abe46b326875ad719b4905a54dda0b0426985f87bde"} Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.260972 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.282233 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.835695 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:39:35 crc kubenswrapper[4909]: I1002 19:39:35.878780 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.012752 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cwz7n"] Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.101784 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:36 crc kubenswrapper[4909]: W1002 19:39:36.138217 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b87fa96_97ae_4b5a_aba6_bb8470cc12bb.slice/crio-d9da1d8fadec7acdfbf212e2062c24deb0661a41532dcbaf443be0b72ddbadab WatchSource:0}: Error finding container d9da1d8fadec7acdfbf212e2062c24deb0661a41532dcbaf443be0b72ddbadab: Status 404 returned error can't find the container with id d9da1d8fadec7acdfbf212e2062c24deb0661a41532dcbaf443be0b72ddbadab Oct 02 19:39:36 crc kubenswrapper[4909]: W1002 19:39:36.147155 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac29833f_f7cb_4d06_96f8_3f73e527b175.slice/crio-331b4334042674e288b795ad2d670af1f745afba111ef442f8387f12a14fedc3 WatchSource:0}: Error finding container 331b4334042674e288b795ad2d670af1f745afba111ef442f8387f12a14fedc3: Status 404 returned error can't find the container with id 331b4334042674e288b795ad2d670af1f745afba111ef442f8387f12a14fedc3 Oct 02 19:39:36 crc kubenswrapper[4909]: W1002 19:39:36.150755 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeab993d_863a_41d2_8568_3b24b3cc257e.slice/crio-bdcce8f681facfdba0bc739b63b28f3c79f7ad4805bb2812e567cfaadbc28e42 WatchSource:0}: Error finding container bdcce8f681facfdba0bc739b63b28f3c79f7ad4805bb2812e567cfaadbc28e42: Status 404 returned error can't find the container with id bdcce8f681facfdba0bc739b63b28f3c79f7ad4805bb2812e567cfaadbc28e42 Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.495118 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.498178 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611148 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt42d\" (UniqueName: \"kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d\") pod \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611213 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs\") pod \"00ac5257-bc64-4a28-92a8-13b68c074824\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611238 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data\") pod \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611347 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgtq\" (UniqueName: \"kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq\") pod \"00ac5257-bc64-4a28-92a8-13b68c074824\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611363 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs\") pod \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data\") pod \"00ac5257-bc64-4a28-92a8-13b68c074824\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611419 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key\") pod \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611442 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts\") pod \"00ac5257-bc64-4a28-92a8-13b68c074824\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key\") pod \"00ac5257-bc64-4a28-92a8-13b68c074824\" (UID: \"00ac5257-bc64-4a28-92a8-13b68c074824\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.611656 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts\") pod \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\" (UID: \"b9313bee-1007-427f-9a6a-fd93f7c4aa5b\") " Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.612257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs" (OuterVolumeSpecName: "logs") pod "b9313bee-1007-427f-9a6a-fd93f7c4aa5b" (UID: "b9313bee-1007-427f-9a6a-fd93f7c4aa5b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.613762 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs" (OuterVolumeSpecName: "logs") pod "00ac5257-bc64-4a28-92a8-13b68c074824" (UID: "00ac5257-bc64-4a28-92a8-13b68c074824"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.630067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq" (OuterVolumeSpecName: "kube-api-access-8qgtq") pod "00ac5257-bc64-4a28-92a8-13b68c074824" (UID: "00ac5257-bc64-4a28-92a8-13b68c074824"). InnerVolumeSpecName "kube-api-access-8qgtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.631250 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b9313bee-1007-427f-9a6a-fd93f7c4aa5b" (UID: "b9313bee-1007-427f-9a6a-fd93f7c4aa5b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.634643 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d" (OuterVolumeSpecName: "kube-api-access-xt42d") pod "b9313bee-1007-427f-9a6a-fd93f7c4aa5b" (UID: "b9313bee-1007-427f-9a6a-fd93f7c4aa5b"). InnerVolumeSpecName "kube-api-access-xt42d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.634766 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "00ac5257-bc64-4a28-92a8-13b68c074824" (UID: "00ac5257-bc64-4a28-92a8-13b68c074824"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729099 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt42d\" (UniqueName: \"kubernetes.io/projected/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-kube-api-access-xt42d\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729131 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ac5257-bc64-4a28-92a8-13b68c074824-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729142 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qgtq\" (UniqueName: \"kubernetes.io/projected/00ac5257-bc64-4a28-92a8-13b68c074824-kube-api-access-8qgtq\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729152 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729160 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.729168 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/00ac5257-bc64-4a28-92a8-13b68c074824-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.745985 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data" (OuterVolumeSpecName: "config-data") pod "00ac5257-bc64-4a28-92a8-13b68c074824" (UID: "00ac5257-bc64-4a28-92a8-13b68c074824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.780791 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data" (OuterVolumeSpecName: "config-data") pod "b9313bee-1007-427f-9a6a-fd93f7c4aa5b" (UID: "b9313bee-1007-427f-9a6a-fd93f7c4aa5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.785888 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts" (OuterVolumeSpecName: "scripts") pod "00ac5257-bc64-4a28-92a8-13b68c074824" (UID: "00ac5257-bc64-4a28-92a8-13b68c074824"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.798158 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts" (OuterVolumeSpecName: "scripts") pod "b9313bee-1007-427f-9a6a-fd93f7c4aa5b" (UID: "b9313bee-1007-427f-9a6a-fd93f7c4aa5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.831509 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.831540 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.831550 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9313bee-1007-427f-9a6a-fd93f7c4aa5b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:36 crc kubenswrapper[4909]: I1002 19:39:36.831560 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00ac5257-bc64-4a28-92a8-13b68c074824-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.110356 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac29833f-f7cb-4d06-96f8-3f73e527b175" containerID="8fdc0aa2fe0f7ff52aa9c4558dfacc5b4325d995cba5fa3187aa63b3ed7b3fd0" exitCode=0 Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.112452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" event={"ID":"ac29833f-f7cb-4d06-96f8-3f73e527b175","Type":"ContainerDied","Data":"8fdc0aa2fe0f7ff52aa9c4558dfacc5b4325d995cba5fa3187aa63b3ed7b3fd0"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.112485 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" event={"ID":"ac29833f-f7cb-4d06-96f8-3f73e527b175","Type":"ContainerStarted","Data":"331b4334042674e288b795ad2d670af1f745afba111ef442f8387f12a14fedc3"} Oct 02 19:39:37 crc 
kubenswrapper[4909]: I1002 19:39:37.113927 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerStarted","Data":"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.113946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerStarted","Data":"bdcce8f681facfdba0bc739b63b28f3c79f7ad4805bb2812e567cfaadbc28e42"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.117507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dbbb67d5-fq6b9" event={"ID":"00ac5257-bc64-4a28-92a8-13b68c074824","Type":"ContainerDied","Data":"ee6087853981a49da837142c955cc8b1afd40de02799f2da3ccdd2315a9f96d8"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.117556 4909 scope.go:117] "RemoveContainer" containerID="f87a5a0bd3fdfa394f350e3d3e6ab9e199fe4a4a70f991d3626cb12b505a2f8d" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.117710 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8dbbb67d5-fq6b9" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.120664 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerStarted","Data":"cfd79380159f8adcf5b66ec962cb3441aa8ba7858e80a53da6e73475c702b7e3"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.122453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655568d68f-86d8j" event={"ID":"b9313bee-1007-427f-9a6a-fd93f7c4aa5b","Type":"ContainerDied","Data":"325c57c96b3a2a09a1e5b98b681e3e743e20ce38bd1b72c6a0aff5e04636d978"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.122542 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-655568d68f-86d8j" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.127650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerStarted","Data":"d9da1d8fadec7acdfbf212e2062c24deb0661a41532dcbaf443be0b72ddbadab"} Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.181104 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.206259 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8dbbb67d5-fq6b9"] Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.234047 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.249670 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-655568d68f-86d8j"] Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.474364 4909 scope.go:117] "RemoveContainer" containerID="158a6747681309b253213abe46b326875ad719b4905a54dda0b0426985f87bde" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.569589 4909 scope.go:117] "RemoveContainer" containerID="3f689b4a98f2016c95f77a2c10973a88a6929b5a0c70e2f5cbbcb04031ca40e0" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.627535 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" path="/var/lib/kubelet/pods/00ac5257-bc64-4a28-92a8-13b68c074824/volumes" Oct 02 19:39:37 crc kubenswrapper[4909]: I1002 19:39:37.628987 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" path="/var/lib/kubelet/pods/b9313bee-1007-427f-9a6a-fd93f7c4aa5b/volumes" Oct 02 19:39:38 crc kubenswrapper[4909]: I1002 19:39:38.101866 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-api-0"] Oct 02 19:39:38 crc kubenswrapper[4909]: I1002 19:39:38.436350 4909 scope.go:117] "RemoveContainer" containerID="b725d895109d85ab1a9fcc24f657dbff88ac8788cbe897cf2fee1d3db5f693b5" Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.158776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerStarted","Data":"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8"} Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.159220 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerStarted","Data":"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b"} Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.176179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" event={"ID":"ac29833f-f7cb-4d06-96f8-3f73e527b175","Type":"ContainerStarted","Data":"560e02bdb09b7f47a5b6dedfe96a7ef6dae361eb363882f6f4b26e793d76eded"} Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.176392 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.184221 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.279743315 podStartE2EDuration="5.184207222s" podCreationTimestamp="2025-10-02 19:39:34 +0000 UTC" firstStartedPulling="2025-10-02 19:39:36.147345358 +0000 UTC m=+4897.334841207" lastFinishedPulling="2025-10-02 19:39:37.051809245 +0000 UTC m=+4898.239305114" observedRunningTime="2025-10-02 19:39:39.175331136 +0000 UTC m=+4900.362826995" watchObservedRunningTime="2025-10-02 19:39:39.184207222 +0000 UTC m=+4900.371703081" Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.192833 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerStarted","Data":"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3"} Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.192956 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api-log" containerID="cri-o://eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" gracePeriod=30 Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.193090 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.193126 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api" containerID="cri-o://2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" gracePeriod=30 Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.208993 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" podStartSLOduration=5.208971354 podStartE2EDuration="5.208971354s" podCreationTimestamp="2025-10-02 19:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:39:39.202222264 +0000 UTC m=+4900.389718123" watchObservedRunningTime="2025-10-02 19:39:39.208971354 +0000 UTC m=+4900.396467213" Oct 02 19:39:39 crc kubenswrapper[4909]: I1002 19:39:39.232558 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.232538089 podStartE2EDuration="5.232538089s" podCreationTimestamp="2025-10-02 19:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:39:39.220610427 +0000 UTC m=+4900.408106286" watchObservedRunningTime="2025-10-02 19:39:39.232538089 +0000 UTC m=+4900.420033948" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.167462 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206338 4909 generic.go:334] "Generic (PLEG): container finished" podID="feab993d-863a-41d2-8568-3b24b3cc257e" containerID="2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" exitCode=0 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206375 4909 generic.go:334] "Generic (PLEG): container finished" podID="feab993d-863a-41d2-8568-3b24b3cc257e" containerID="eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" exitCode=143 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206396 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206407 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerDied","Data":"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3"} Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerDied","Data":"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441"} Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206484 4909 scope.go:117] "RemoveContainer" containerID="2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.206490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"feab993d-863a-41d2-8568-3b24b3cc257e","Type":"ContainerDied","Data":"bdcce8f681facfdba0bc739b63b28f3c79f7ad4805bb2812e567cfaadbc28e42"} Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.244842 4909 scope.go:117] "RemoveContainer" containerID="eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.278138 4909 scope.go:117] "RemoveContainer" containerID="2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.279119 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3\": container with ID starting with 2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3 not found: ID does not exist" containerID="2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 
19:39:40.279152 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3"} err="failed to get container status \"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3\": rpc error: code = NotFound desc = could not find container \"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3\": container with ID starting with 2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3 not found: ID does not exist" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.279173 4909 scope.go:117] "RemoveContainer" containerID="eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.279424 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441\": container with ID starting with eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441 not found: ID does not exist" containerID="eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.279482 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441"} err="failed to get container status \"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441\": rpc error: code = NotFound desc = could not find container \"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441\": container with ID starting with eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441 not found: ID does not exist" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.279496 4909 scope.go:117] "RemoveContainer" containerID="2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3" Oct 02 19:39:40 crc 
kubenswrapper[4909]: I1002 19:39:40.279811 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3"} err="failed to get container status \"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3\": rpc error: code = NotFound desc = could not find container \"2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3\": container with ID starting with 2c47cd2a603debe43070f1f8545b4d890339f54da25361e97480c4e0270515e3 not found: ID does not exist" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.279870 4909 scope.go:117] "RemoveContainer" containerID="eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.280270 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441"} err="failed to get container status \"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441\": rpc error: code = NotFound desc = could not find container \"eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441\": container with ID starting with eb8c91f06095193bde2f79f84cf181f9b5029836ff3e69ed272cf688af277441 not found: ID does not exist" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.352802 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57tzd\" (UniqueName: \"kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.352979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle\") pod 
\"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.353004 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.353242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.353265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.353290 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.353317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data\") pod \"feab993d-863a-41d2-8568-3b24b3cc257e\" (UID: \"feab993d-863a-41d2-8568-3b24b3cc257e\") " Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.356443 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs" (OuterVolumeSpecName: "logs") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.356570 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.375255 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts" (OuterVolumeSpecName: "scripts") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.375366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd" (OuterVolumeSpecName: "kube-api-access-57tzd") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "kube-api-access-57tzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.375356 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.396042 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.424568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data" (OuterVolumeSpecName: "config-data") pod "feab993d-863a-41d2-8568-3b24b3cc257e" (UID: "feab993d-863a-41d2-8568-3b24b3cc257e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456783 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456817 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456830 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feab993d-863a-41d2-8568-3b24b3cc257e-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456843 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feab993d-863a-41d2-8568-3b24b3cc257e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 
19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456854 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456864 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feab993d-863a-41d2-8568-3b24b3cc257e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.456877 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57tzd\" (UniqueName: \"kubernetes.io/projected/feab993d-863a-41d2-8568-3b24b3cc257e-kube-api-access-57tzd\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.594378 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.607983 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.619235 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.619496 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-notification-agent" containerID="cri-o://4fd3acd9964168038d5cd23fb6ce81e8dcc3ce147ef98d00e703f8cb9484ed5d" gracePeriod=30 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.619890 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-central-agent" containerID="cri-o://2e719bf839e5244b72c7c63a7c45850173203ef6e7cc30d052704a825fd95864" gracePeriod=30 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 
19:39:40.619935 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="proxy-httpd" containerID="cri-o://7ae9166ea8d6783d5c427192faca28616a0e7c117c04e58f5a622c6432d83ccd" gracePeriod=30 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.619967 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="sg-core" containerID="cri-o://5e807d69683a3f44c396751eda53f710e39c98cf4d75d32c8a2df1a4f263ca8f" gracePeriod=30 Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.630552 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631240 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631260 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631269 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631276 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631313 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631319 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631335 
4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631340 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631352 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631358 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: E1002 19:39:40.631370 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631377 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631582 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631596 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631609 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631620 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9313bee-1007-427f-9a6a-fd93f7c4aa5b" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631635 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" containerName="manila-api-log" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.631645 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ac5257-bc64-4a28-92a8-13b68c074824" containerName="horizon" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.632833 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.635565 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.637015 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.637205 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.670137 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-scripts\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-internal-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769487 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1680d2-9b8d-4c73-bf55-58ad455baa81-logs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ws5d\" (UniqueName: \"kubernetes.io/projected/ba1680d2-9b8d-4c73-bf55-58ad455baa81-kube-api-access-6ws5d\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.769922 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data-custom\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.770111 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.770296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-public-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.770512 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba1680d2-9b8d-4c73-bf55-58ad455baa81-etc-machine-id\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.872473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba1680d2-9b8d-4c73-bf55-58ad455baa81-etc-machine-id\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.872600 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-internal-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.872608 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba1680d2-9b8d-4c73-bf55-58ad455baa81-etc-machine-id\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.872629 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-scripts\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc 
kubenswrapper[4909]: I1002 19:39:40.873558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.873593 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1680d2-9b8d-4c73-bf55-58ad455baa81-logs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.873993 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ws5d\" (UniqueName: \"kubernetes.io/projected/ba1680d2-9b8d-4c73-bf55-58ad455baa81-kube-api-access-6ws5d\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.874418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1680d2-9b8d-4c73-bf55-58ad455baa81-logs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.874615 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data-custom\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.874793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-combined-ca-bundle\") pod 
\"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.875428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-public-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.877610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-scripts\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.877923 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.878421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-internal-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.880444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-public-tls-certs\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.880515 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.898014 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ws5d\" (UniqueName: \"kubernetes.io/projected/ba1680d2-9b8d-4c73-bf55-58ad455baa81-kube-api-access-6ws5d\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.926859 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba1680d2-9b8d-4c73-bf55-58ad455baa81-config-data-custom\") pod \"manila-api-0\" (UID: \"ba1680d2-9b8d-4c73-bf55-58ad455baa81\") " pod="openstack/manila-api-0" Oct 02 19:39:40 crc kubenswrapper[4909]: I1002 19:39:40.978804 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231543 4909 generic.go:334] "Generic (PLEG): container finished" podID="555cac57-752b-43f7-80f6-2d768759cad4" containerID="2e719bf839e5244b72c7c63a7c45850173203ef6e7cc30d052704a825fd95864" exitCode=0 Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231843 4909 generic.go:334] "Generic (PLEG): container finished" podID="555cac57-752b-43f7-80f6-2d768759cad4" containerID="7ae9166ea8d6783d5c427192faca28616a0e7c117c04e58f5a622c6432d83ccd" exitCode=0 Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231858 4909 generic.go:334] "Generic (PLEG): container finished" podID="555cac57-752b-43f7-80f6-2d768759cad4" containerID="5e807d69683a3f44c396751eda53f710e39c98cf4d75d32c8a2df1a4f263ca8f" exitCode=2 Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerDied","Data":"2e719bf839e5244b72c7c63a7c45850173203ef6e7cc30d052704a825fd95864"} Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231962 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerDied","Data":"7ae9166ea8d6783d5c427192faca28616a0e7c117c04e58f5a622c6432d83ccd"} Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.231977 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerDied","Data":"5e807d69683a3f44c396751eda53f710e39c98cf4d75d32c8a2df1a4f263ca8f"} Oct 02 19:39:41 crc kubenswrapper[4909]: I1002 19:39:41.626071 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feab993d-863a-41d2-8568-3b24b3cc257e" path="/var/lib/kubelet/pods/feab993d-863a-41d2-8568-3b24b3cc257e/volumes" Oct 02 19:39:41 crc 
kubenswrapper[4909]: I1002 19:39:41.804378 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 19:39:42 crc kubenswrapper[4909]: I1002 19:39:42.247233 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ba1680d2-9b8d-4c73-bf55-58ad455baa81","Type":"ContainerStarted","Data":"02538aaa1339e52777a6256e8aed7051a3c8ccb3b88b6c06b60b80f6c369a9d2"} Oct 02 19:39:42 crc kubenswrapper[4909]: I1002 19:39:42.251196 4909 generic.go:334] "Generic (PLEG): container finished" podID="555cac57-752b-43f7-80f6-2d768759cad4" containerID="4fd3acd9964168038d5cd23fb6ce81e8dcc3ce147ef98d00e703f8cb9484ed5d" exitCode=0 Oct 02 19:39:42 crc kubenswrapper[4909]: I1002 19:39:42.251242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerDied","Data":"4fd3acd9964168038d5cd23fb6ce81e8dcc3ce147ef98d00e703f8cb9484ed5d"} Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.643676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.764992 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781563 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781807 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781854 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.781948 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnr5h\" (UniqueName: 
\"kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.782251 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.782313 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd\") pod \"555cac57-752b-43f7-80f6-2d768759cad4\" (UID: \"555cac57-752b-43f7-80f6-2d768759cad4\") " Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.783104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.783609 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.783748 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.789021 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts" (OuterVolumeSpecName: "scripts") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.795197 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h" (OuterVolumeSpecName: "kube-api-access-nnr5h") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "kube-api-access-nnr5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.844944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.886187 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.886224 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnr5h\" (UniqueName: \"kubernetes.io/projected/555cac57-752b-43f7-80f6-2d768759cad4-kube-api-access-nnr5h\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.886236 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/555cac57-752b-43f7-80f6-2d768759cad4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.886245 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.896394 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.911572 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.963349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data" (OuterVolumeSpecName: "config-data") pod "555cac57-752b-43f7-80f6-2d768759cad4" (UID: "555cac57-752b-43f7-80f6-2d768759cad4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.987854 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.988083 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:44 crc kubenswrapper[4909]: I1002 19:39:44.988146 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555cac57-752b-43f7-80f6-2d768759cad4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.263160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74cfff99f-cwz7n" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.293895 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"555cac57-752b-43f7-80f6-2d768759cad4","Type":"ContainerDied","Data":"da77ab77fadc5aef99d136ef3706bec6027dac72b8a11a654b70f48c9e82ca27"} Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.294161 4909 scope.go:117] "RemoveContainer" containerID="2e719bf839e5244b72c7c63a7c45850173203ef6e7cc30d052704a825fd95864" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 
19:39:45.293940 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.300278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerStarted","Data":"97959de4dc1fa1dcdabdab31fb308cc563b35386d7445f80ec08a9535905bfb0"} Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.302935 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ba1680d2-9b8d-4c73-bf55-58ad455baa81","Type":"ContainerStarted","Data":"bff09d1399406918f1a341a89cb2c903f2271dcceff023a82276c7dcb240395b"} Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.302958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ba1680d2-9b8d-4c73-bf55-58ad455baa81","Type":"ContainerStarted","Data":"bd10f8d9323f73e373c896d9edeff47a6741a18c2523e010d44d7fea5be4098a"} Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.304003 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.333862 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.334095 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768b698657-jrj72" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="dnsmasq-dns" containerID="cri-o://9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2" gracePeriod=10 Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.448401 4909 scope.go:117] "RemoveContainer" containerID="7ae9166ea8d6783d5c427192faca28616a0e7c117c04e58f5a622c6432d83ccd" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.460970 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.46095258 podStartE2EDuration="5.46095258s" podCreationTimestamp="2025-10-02 19:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:39:45.360159379 +0000 UTC m=+4906.547655238" watchObservedRunningTime="2025-10-02 19:39:45.46095258 +0000 UTC m=+4906.648448439" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.470111 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.485726 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.512249 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:45 crc kubenswrapper[4909]: E1002 19:39:45.512859 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="sg-core" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.512889 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="sg-core" Oct 02 19:39:45 crc kubenswrapper[4909]: E1002 19:39:45.512913 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="proxy-httpd" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.512922 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="proxy-httpd" Oct 02 19:39:45 crc kubenswrapper[4909]: E1002 19:39:45.512944 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-central-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.512952 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-central-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: E1002 19:39:45.512977 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-notification-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.512983 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-notification-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.513188 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="sg-core" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.513209 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-central-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.513237 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="ceilometer-notification-agent" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.513254 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="555cac57-752b-43f7-80f6-2d768759cad4" containerName="proxy-httpd" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.537179 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.537305 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.541250 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.541423 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.541558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.565827 4909 scope.go:117] "RemoveContainer" containerID="5e807d69683a3f44c396751eda53f710e39c98cf4d75d32c8a2df1a4f263ca8f" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.602242 4909 scope.go:117] "RemoveContainer" containerID="4fd3acd9964168038d5cd23fb6ce81e8dcc3ce147ef98d00e703f8cb9484ed5d" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " 
pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610621 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z925d\" (UniqueName: \"kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.610757 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.634645 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555cac57-752b-43f7-80f6-2d768759cad4" path="/var/lib/kubelet/pods/555cac57-752b-43f7-80f6-2d768759cad4/volumes" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z925d\" (UniqueName: \"kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714741 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.714817 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.716603 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:45 crc kubenswrapper[4909]: I1002 19:39:45.716796 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.182910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.183614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.184000 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.184516 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.184853 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.185781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z925d\" (UniqueName: \"kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d\") pod \"ceilometer-0\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.316439 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.317587 4909 generic.go:334] "Generic (PLEG): container finished" podID="21f44fe5-d38a-40f0-9649-de47086a080a" containerID="9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2" exitCode=0 Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.317643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-jrj72" event={"ID":"21f44fe5-d38a-40f0-9649-de47086a080a","Type":"ContainerDied","Data":"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2"} Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.317682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-jrj72" event={"ID":"21f44fe5-d38a-40f0-9649-de47086a080a","Type":"ContainerDied","Data":"fd60f184f5b890203faab9277fc169103915a0925afc9c8b19f9911f8fc1c2ed"} Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.317700 4909 scope.go:117] "RemoveContainer" containerID="9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.321506 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerStarted","Data":"013277b39663f849dbc8c073d9ccab59cc7d1b42e172408db33dfb254df18fbc"} Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326051 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326091 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326150 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326196 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326220 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326256 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.326329 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vp96\" (UniqueName: \"kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96\") pod \"21f44fe5-d38a-40f0-9649-de47086a080a\" (UID: \"21f44fe5-d38a-40f0-9649-de47086a080a\") " Oct 02 19:39:46 crc kubenswrapper[4909]: 
I1002 19:39:46.347644 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96" (OuterVolumeSpecName: "kube-api-access-5vp96") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "kube-api-access-5vp96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.363223 4909 scope.go:117] "RemoveContainer" containerID="2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.382345 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.235275418 podStartE2EDuration="12.382323805s" podCreationTimestamp="2025-10-02 19:39:34 +0000 UTC" firstStartedPulling="2025-10-02 19:39:36.17793254 +0000 UTC m=+4897.365428399" lastFinishedPulling="2025-10-02 19:39:44.324980927 +0000 UTC m=+4905.512476786" observedRunningTime="2025-10-02 19:39:46.379900159 +0000 UTC m=+4907.567396028" watchObservedRunningTime="2025-10-02 19:39:46.382323805 +0000 UTC m=+4907.569819664" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.428845 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vp96\" (UniqueName: \"kubernetes.io/projected/21f44fe5-d38a-40f0-9649-de47086a080a-kube-api-access-5vp96\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.464354 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.614548 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.615156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.618226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.650112 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.650140 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.650150 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.717767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.728837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config" (OuterVolumeSpecName: "config") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.729038 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "21f44fe5-d38a-40f0-9649-de47086a080a" (UID: "21f44fe5-d38a-40f0-9649-de47086a080a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.752213 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.752249 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:46 crc kubenswrapper[4909]: I1002 19:39:46.752262 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f44fe5-d38a-40f0-9649-de47086a080a-config\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.305000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.332692 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-jrj72" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.381701 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.392874 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768b698657-jrj72"] Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.501474 4909 scope.go:117] "RemoveContainer" containerID="9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2" Oct 02 19:39:47 crc kubenswrapper[4909]: E1002 19:39:47.502054 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2\": container with ID starting with 9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2 not found: ID does not exist" containerID="9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.502186 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2"} err="failed to get container status \"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2\": rpc error: code = NotFound desc = could not find container \"9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2\": container with ID starting with 9d93677fccc27dbb58bd9b1238f9062627ac1932d810f820bbc8ba84de0cc5a2 not found: ID does not exist" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.502338 4909 scope.go:117] "RemoveContainer" containerID="2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8" Oct 02 19:39:47 crc kubenswrapper[4909]: E1002 19:39:47.502835 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8\": container with ID starting with 2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8 not found: ID does not exist" containerID="2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.502867 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8"} err="failed to get container status \"2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8\": rpc error: code = NotFound desc = could not find container \"2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8\": container with ID starting with 2393a99ca8010cd92bdf5939f63716bfb06ae022528f40a7239ff1fb7d10b3b8 not found: ID does not exist" Oct 02 19:39:47 crc kubenswrapper[4909]: I1002 19:39:47.620581 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" path="/var/lib/kubelet/pods/21f44fe5-d38a-40f0-9649-de47086a080a/volumes" Oct 02 19:39:48 crc kubenswrapper[4909]: I1002 19:39:48.344151 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerStarted","Data":"53f2aed195e859083bc4ba3ee8dad0bdd77cbcc77dbee2129e218e84c6326c53"} Oct 02 19:39:49 crc kubenswrapper[4909]: I1002 19:39:49.091085 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:49 crc kubenswrapper[4909]: I1002 19:39:49.355161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerStarted","Data":"62a39006f05970b240e4e3d6b7f9d65173d51df2931ad3c06af6c9639f86c7d7"} Oct 02 19:39:49 crc kubenswrapper[4909]: I1002 19:39:49.355206 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerStarted","Data":"9df591a0a494c6d7cf2e88c3b389dd45b3060b07812f3d478642d957a2342100"} Oct 02 19:39:50 crc kubenswrapper[4909]: I1002 19:39:50.369013 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerStarted","Data":"503e21fc7c0b6e39ab42732ed319261e8ac90395bda4ec07b26450f378816137"} Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.405291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerStarted","Data":"b2bc26d3a0adc0479c56be6b11dcce79fcc3aed446195e1719b5b6c7cc25f987"} Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.405995 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.405883 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-central-agent" containerID="cri-o://9df591a0a494c6d7cf2e88c3b389dd45b3060b07812f3d478642d957a2342100" gracePeriod=30 Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.406494 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="sg-core" containerID="cri-o://503e21fc7c0b6e39ab42732ed319261e8ac90395bda4ec07b26450f378816137" gracePeriod=30 Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.406593 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-notification-agent" containerID="cri-o://62a39006f05970b240e4e3d6b7f9d65173d51df2931ad3c06af6c9639f86c7d7" gracePeriod=30 Oct 
02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.406594 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="proxy-httpd" containerID="cri-o://b2bc26d3a0adc0479c56be6b11dcce79fcc3aed446195e1719b5b6c7cc25f987" gracePeriod=30 Oct 02 19:39:52 crc kubenswrapper[4909]: I1002 19:39:52.448628 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.8170212340000003 podStartE2EDuration="7.448591363s" podCreationTimestamp="2025-10-02 19:39:45 +0000 UTC" firstStartedPulling="2025-10-02 19:39:47.511628901 +0000 UTC m=+4908.699124760" lastFinishedPulling="2025-10-02 19:39:51.14319903 +0000 UTC m=+4912.330694889" observedRunningTime="2025-10-02 19:39:52.442633467 +0000 UTC m=+4913.630129346" watchObservedRunningTime="2025-10-02 19:39:52.448591363 +0000 UTC m=+4913.636087222" Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.054189 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.054559 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.054611 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.055436 4909 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.055493 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac" gracePeriod=600 Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420174 4909 generic.go:334] "Generic (PLEG): container finished" podID="9023d22c-f526-4840-9f51-2d55982270e4" containerID="b2bc26d3a0adc0479c56be6b11dcce79fcc3aed446195e1719b5b6c7cc25f987" exitCode=0 Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420430 4909 generic.go:334] "Generic (PLEG): container finished" podID="9023d22c-f526-4840-9f51-2d55982270e4" containerID="503e21fc7c0b6e39ab42732ed319261e8ac90395bda4ec07b26450f378816137" exitCode=2 Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420443 4909 generic.go:334] "Generic (PLEG): container finished" podID="9023d22c-f526-4840-9f51-2d55982270e4" containerID="62a39006f05970b240e4e3d6b7f9d65173d51df2931ad3c06af6c9639f86c7d7" exitCode=0 Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerDied","Data":"b2bc26d3a0adc0479c56be6b11dcce79fcc3aed446195e1719b5b6c7cc25f987"} Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerDied","Data":"503e21fc7c0b6e39ab42732ed319261e8ac90395bda4ec07b26450f378816137"} Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.420519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerDied","Data":"62a39006f05970b240e4e3d6b7f9d65173d51df2931ad3c06af6c9639f86c7d7"} Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.424102 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac" exitCode=0 Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.424131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac"} Oct 02 19:39:53 crc kubenswrapper[4909]: I1002 19:39:53.424154 4909 scope.go:117] "RemoveContainer" containerID="195aa5c8185ffdd688b0bb2cab971f14eb5a9e130475ac4b74b78c912c9eb9e5" Oct 02 19:39:54 crc kubenswrapper[4909]: I1002 19:39:54.439499 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40"} Oct 02 19:39:54 crc kubenswrapper[4909]: I1002 19:39:54.607116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.113515 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.194239 4909 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.228318 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.274041 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.462715 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="manila-share" containerID="cri-o://97959de4dc1fa1dcdabdab31fb308cc563b35386d7445f80ec08a9535905bfb0" gracePeriod=30 Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.462845 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="probe" containerID="cri-o://013277b39663f849dbc8c073d9ccab59cc7d1b42e172408db33dfb254df18fbc" gracePeriod=30 Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.462939 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="probe" containerID="cri-o://3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8" gracePeriod=30 Oct 02 19:39:56 crc kubenswrapper[4909]: I1002 19:39:56.462866 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="manila-scheduler" containerID="cri-o://a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b" gracePeriod=30 Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.481982 4909 generic.go:334] "Generic (PLEG): container finished" podID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" 
containerID="3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8" exitCode=0 Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.485425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerDied","Data":"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8"} Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.488946 4909 generic.go:334] "Generic (PLEG): container finished" podID="9023d22c-f526-4840-9f51-2d55982270e4" containerID="9df591a0a494c6d7cf2e88c3b389dd45b3060b07812f3d478642d957a2342100" exitCode=0 Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.489004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerDied","Data":"9df591a0a494c6d7cf2e88c3b389dd45b3060b07812f3d478642d957a2342100"} Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.490959 4909 generic.go:334] "Generic (PLEG): container finished" podID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerID="013277b39663f849dbc8c073d9ccab59cc7d1b42e172408db33dfb254df18fbc" exitCode=0 Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.490975 4909 generic.go:334] "Generic (PLEG): container finished" podID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerID="97959de4dc1fa1dcdabdab31fb308cc563b35386d7445f80ec08a9535905bfb0" exitCode=1 Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.490994 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerDied","Data":"013277b39663f849dbc8c073d9ccab59cc7d1b42e172408db33dfb254df18fbc"} Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.491008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerDied","Data":"97959de4dc1fa1dcdabdab31fb308cc563b35386d7445f80ec08a9535905bfb0"} Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.901605 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984657 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984712 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984854 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984889 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z925d\" (UniqueName: 
\"kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.984979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts\") pod \"9023d22c-f526-4840-9f51-2d55982270e4\" (UID: \"9023d22c-f526-4840-9f51-2d55982270e4\") " Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.985362 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.985931 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.991328 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts" (OuterVolumeSpecName: "scripts") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:57 crc kubenswrapper[4909]: I1002 19:39:57.992532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d" (OuterVolumeSpecName: "kube-api-access-z925d") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "kube-api-access-z925d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.024185 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.087264 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.087306 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z925d\" (UniqueName: \"kubernetes.io/projected/9023d22c-f526-4840-9f51-2d55982270e4-kube-api-access-z925d\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.087319 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9023d22c-f526-4840-9f51-2d55982270e4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.087330 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.087340 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.107332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.124128 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.164254 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data" (OuterVolumeSpecName: "config-data") pod "9023d22c-f526-4840-9f51-2d55982270e4" (UID: "9023d22c-f526-4840-9f51-2d55982270e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.167995 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189418 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189540 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189576 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189626 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189693 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.189861 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ggh8\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8\") pod \"59f20fc0-b824-445d-8fcb-40b809d0947f\" (UID: \"59f20fc0-b824-445d-8fcb-40b809d0947f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.190412 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.190428 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.190438 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023d22c-f526-4840-9f51-2d55982270e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.193537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: 
"59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.194200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8" (OuterVolumeSpecName: "kube-api-access-4ggh8") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "kube-api-access-4ggh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.197220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.197281 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.200274 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts" (OuterVolumeSpecName: "scripts") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.212945 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph" (OuterVolumeSpecName: "ceph") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.255796 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.290786 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.291337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.292339 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.292415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" 
(UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.292534 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7kgv\" (UniqueName: \"kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.292640 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.292757 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle\") pod \"ff545e92-3c9d-45b8-9438-89122eeef32f\" (UID: \"ff545e92-3c9d-45b8-9438-89122eeef32f\") " Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.293046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs" (OuterVolumeSpecName: "logs") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.293513 4909 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.293584 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.293696 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.293752 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.294320 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ggh8\" (UniqueName: \"kubernetes.io/projected/59f20fc0-b824-445d-8fcb-40b809d0947f-kube-api-access-4ggh8\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.294404 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59f20fc0-b824-445d-8fcb-40b809d0947f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.294459 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff545e92-3c9d-45b8-9438-89122eeef32f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.295077 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.297718 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv" (OuterVolumeSpecName: "kube-api-access-k7kgv") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "kube-api-access-k7kgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.311622 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.329527 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data" (OuterVolumeSpecName: "config-data") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.339487 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.343743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts" (OuterVolumeSpecName: "scripts") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.358110 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ff545e92-3c9d-45b8-9438-89122eeef32f" (UID: "ff545e92-3c9d-45b8-9438-89122eeef32f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.359651 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data" (OuterVolumeSpecName: "config-data") pod "59f20fc0-b824-445d-8fcb-40b809d0947f" (UID: "59f20fc0-b824-445d-8fcb-40b809d0947f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396266 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396307 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396319 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396329 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f20fc0-b824-445d-8fcb-40b809d0947f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396339 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff545e92-3c9d-45b8-9438-89122eeef32f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396350 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.396358 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7kgv\" (UniqueName: \"kubernetes.io/projected/ff545e92-3c9d-45b8-9438-89122eeef32f-kube-api-access-k7kgv\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 
19:39:58.396367 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff545e92-3c9d-45b8-9438-89122eeef32f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.507334 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerID="9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca" exitCode=137 Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.507412 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerDied","Data":"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca"} Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.507445 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54b6f8f8f6-vtk9c" event={"ID":"ff545e92-3c9d-45b8-9438-89122eeef32f","Type":"ContainerDied","Data":"0cd87e955892847bfb746dab46d293b8570bf6530b620197c63e8a4f2df0c6db"} Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.507463 4909 scope.go:117] "RemoveContainer" containerID="a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.507601 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54b6f8f8f6-vtk9c" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.515366 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.515374 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"59f20fc0-b824-445d-8fcb-40b809d0947f","Type":"ContainerDied","Data":"cfd79380159f8adcf5b66ec962cb3441aa8ba7858e80a53da6e73475c702b7e3"} Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.522428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9023d22c-f526-4840-9f51-2d55982270e4","Type":"ContainerDied","Data":"53f2aed195e859083bc4ba3ee8dad0bdd77cbcc77dbee2129e218e84c6326c53"} Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.522522 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.578708 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.613077 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.625169 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.643983 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54b6f8f8f6-vtk9c"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.671809 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.673966 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="probe" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674053 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="probe" Oct 02 19:39:58 crc 
kubenswrapper[4909]: E1002 19:39:58.674109 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="init" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674123 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="init" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674139 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="dnsmasq-dns" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674148 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="dnsmasq-dns" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674171 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-notification-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674181 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-notification-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674204 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674213 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674233 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="manila-share" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674244 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="manila-share" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 
19:39:58.674297 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="proxy-httpd" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674309 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="proxy-httpd" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674406 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon-log" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674417 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon-log" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674473 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-central-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674484 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-central-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.674507 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="sg-core" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.674516 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="sg-core" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675212 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-central-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675259 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon-log" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 
19:39:58.675285 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="ceilometer-notification-agent" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675304 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f44fe5-d38a-40f0-9649-de47086a080a" containerName="dnsmasq-dns" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675316 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="manila-share" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675360 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" containerName="horizon" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675375 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="sg-core" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675400 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9023d22c-f526-4840-9f51-2d55982270e4" containerName="proxy-httpd" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.675437 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" containerName="probe" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.682128 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.684421 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.689048 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.689640 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.698969 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.716233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.727145 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.737780 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.741444 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.743329 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.751004 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.767151 4909 scope.go:117] "RemoveContainer" containerID="9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.811627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.811694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89b8\" (UniqueName: \"kubernetes.io/projected/5e3dec40-d45b-44d5-858e-72e56c62dfed-kube-api-access-d89b8\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.811828 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.812047 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-scripts\") pod 
\"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.812138 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.812258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.812344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-config-data\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.812577 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.862667 4909 scope.go:117] "RemoveContainer" containerID="a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.863348 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad\": container with ID starting with a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad not found: ID does not exist" containerID="a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.863400 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad"} err="failed to get container status \"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad\": rpc error: code = NotFound desc = could not find container \"a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad\": container with ID starting with a738f4214badc44a45fa40a46502b0fa2394c860bc50426833918afc50c0d9ad not found: ID does not exist" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.863432 4909 scope.go:117] "RemoveContainer" containerID="9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca" Oct 02 19:39:58 crc kubenswrapper[4909]: E1002 19:39:58.863923 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca\": container with ID starting with 9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca not found: ID does not exist" containerID="9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.863950 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca"} err="failed to get container status \"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca\": rpc error: code = NotFound desc = could not find container \"9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca\": container with ID 
starting with 9e604d71d7e918440dcd5f3671ed6eaf73370aaa7a3e730966d782e2aa0f8eca not found: ID does not exist" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.863971 4909 scope.go:117] "RemoveContainer" containerID="013277b39663f849dbc8c073d9ccab59cc7d1b42e172408db33dfb254df18fbc" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-config-data\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-ceph\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67z9b\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-kube-api-access-67z9b\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915388 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915413 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915481 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89b8\" (UniqueName: \"kubernetes.io/projected/5e3dec40-d45b-44d5-858e-72e56c62dfed-kube-api-access-d89b8\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-scripts\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915609 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-scripts\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915688 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.915758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.916236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3dec40-d45b-44d5-858e-72e56c62dfed-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.920629 4909 scope.go:117] "RemoveContainer" containerID="97959de4dc1fa1dcdabdab31fb308cc563b35386d7445f80ec08a9535905bfb0" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.967292 4909 scope.go:117] "RemoveContainer" containerID="b2bc26d3a0adc0479c56be6b11dcce79fcc3aed446195e1719b5b6c7cc25f987" Oct 02 19:39:58 crc kubenswrapper[4909]: I1002 19:39:58.988714 4909 scope.go:117] "RemoveContainer" containerID="503e21fc7c0b6e39ab42732ed319261e8ac90395bda4ec07b26450f378816137" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017405 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-ceph\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67z9b\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-kube-api-access-67z9b\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017747 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017818 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/16f4bb3d-8601-40aa-bef4-026dc559b7a9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.017949 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-scripts\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.018008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.023597 4909 scope.go:117] "RemoveContainer" containerID="62a39006f05970b240e4e3d6b7f9d65173d51df2931ad3c06af6c9639f86c7d7" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.066171 4909 scope.go:117] "RemoveContainer" containerID="9df591a0a494c6d7cf2e88c3b389dd45b3060b07812f3d478642d957a2342100" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.286848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " 
pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.286954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-config-data\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.287918 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.292309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.292553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3dec40-d45b-44d5-858e-72e56c62dfed-scripts\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.293086 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89b8\" (UniqueName: \"kubernetes.io/projected/5e3dec40-d45b-44d5-858e-72e56c62dfed-kube-api-access-d89b8\") pod \"ceilometer-0\" (UID: \"5e3dec40-d45b-44d5-858e-72e56c62dfed\") " pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.293973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.296389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-ceph\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.297911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-scripts\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.298008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.298418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16f4bb3d-8601-40aa-bef4-026dc559b7a9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.298611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67z9b\" (UniqueName: \"kubernetes.io/projected/16f4bb3d-8601-40aa-bef4-026dc559b7a9-kube-api-access-67z9b\") pod \"manila-share-share1-0\" (UID: \"16f4bb3d-8601-40aa-bef4-026dc559b7a9\") " 
pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.316056 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.376646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.624173 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f20fc0-b824-445d-8fcb-40b809d0947f" path="/var/lib/kubelet/pods/59f20fc0-b824-445d-8fcb-40b809d0947f/volumes" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.625418 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9023d22c-f526-4840-9f51-2d55982270e4" path="/var/lib/kubelet/pods/9023d22c-f526-4840-9f51-2d55982270e4/volumes" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.626668 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff545e92-3c9d-45b8-9438-89122eeef32f" path="/var/lib/kubelet/pods/ff545e92-3c9d-45b8-9438-89122eeef32f/volumes" Oct 02 19:39:59 crc kubenswrapper[4909]: I1002 19:39:59.862105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 19:40:00 crc kubenswrapper[4909]: I1002 19:40:00.100336 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 19:40:00 crc kubenswrapper[4909]: I1002 19:40:00.576413 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"16f4bb3d-8601-40aa-bef4-026dc559b7a9","Type":"ContainerStarted","Data":"3ebc148b4c0a9fb13c515e9141e5a13349cb946a9ce2c4a7916539901bd31400"} Oct 02 19:40:00 crc kubenswrapper[4909]: I1002 19:40:00.578680 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5e3dec40-d45b-44d5-858e-72e56c62dfed","Type":"ContainerStarted","Data":"08d9298743242d7169dc4d37f0b151ceaf4c6607cd3d1220e1fb5d3bf7ba3bee"} Oct 02 19:40:00 crc kubenswrapper[4909]: I1002 19:40:00.578718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e3dec40-d45b-44d5-858e-72e56c62dfed","Type":"ContainerStarted","Data":"7ec50140dcb20b2500455cd2435193556f02f4dc114684bfc0407e40c38673de"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.131553 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.204667 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id\") pod \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.204729 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt2qz\" (UniqueName: \"kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz\") pod \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.204793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data\") pod \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.204864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle\") pod 
\"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.204981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom\") pod \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.205002 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.205112 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts\") pod \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\" (UID: \"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb\") " Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.205690 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.213167 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.213249 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts" (OuterVolumeSpecName: "scripts") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.213402 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz" (OuterVolumeSpecName: "kube-api-access-xt2qz") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "kube-api-access-xt2qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.266515 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.307729 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.307759 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.307768 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.307777 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt2qz\" (UniqueName: \"kubernetes.io/projected/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-kube-api-access-xt2qz\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.352671 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data" (OuterVolumeSpecName: "config-data") pod "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" (UID: "1b87fa96-97ae-4b5a-aba6-bb8470cc12bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.410898 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.653852 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e3dec40-d45b-44d5-858e-72e56c62dfed","Type":"ContainerStarted","Data":"dc8e8b0f12172e53ffb6cc59346d3a9c9dce3c483219fdbda538a8bb2f9ca0c6"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.673170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"16f4bb3d-8601-40aa-bef4-026dc559b7a9","Type":"ContainerStarted","Data":"be628bb3026495be923a618a5613efc3dfa79c468046f594326adde5309b48da"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.673439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"16f4bb3d-8601-40aa-bef4-026dc559b7a9","Type":"ContainerStarted","Data":"3f5e11c2d1edaef9a08cf7bf083168b9d2260e9ee4949e86491694eeba7aacfd"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.677761 4909 generic.go:334] "Generic (PLEG): container finished" podID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerID="a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b" exitCode=0 Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.677803 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerDied","Data":"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.677826 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"1b87fa96-97ae-4b5a-aba6-bb8470cc12bb","Type":"ContainerDied","Data":"d9da1d8fadec7acdfbf212e2062c24deb0661a41532dcbaf443be0b72ddbadab"} Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.677842 4909 scope.go:117] "RemoveContainer" containerID="3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.677952 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.717998 4909 scope.go:117] "RemoveContainer" containerID="a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.719273 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.719257138 podStartE2EDuration="3.719257138s" podCreationTimestamp="2025-10-02 19:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:40:01.699351668 +0000 UTC m=+4922.886847527" watchObservedRunningTime="2025-10-02 19:40:01.719257138 +0000 UTC m=+4922.906752997" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.731687 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.745511 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.782658 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:40:01 crc kubenswrapper[4909]: E1002 19:40:01.783139 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="manila-scheduler" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.783151 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="manila-scheduler" Oct 02 19:40:01 crc kubenswrapper[4909]: E1002 19:40:01.783196 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="probe" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.783204 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="probe" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.783468 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="manila-scheduler" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.783492 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" containerName="probe" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.784808 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.788068 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.792756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.797254 4909 scope.go:117] "RemoveContainer" containerID="3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8" Oct 02 19:40:01 crc kubenswrapper[4909]: E1002 19:40:01.798675 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8\": container with ID starting with 3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8 not found: ID does not exist" containerID="3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.798717 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8"} err="failed to get container status \"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8\": rpc error: code = NotFound desc = could not find container \"3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8\": container with ID starting with 3244cd1ca4366dedcfa07ec5a0cf5eada2079f090567ff7ddf453511ae918ac8 not found: ID does not exist" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.798751 4909 scope.go:117] "RemoveContainer" containerID="a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b" Oct 02 19:40:01 crc kubenswrapper[4909]: E1002 19:40:01.799653 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b\": container with ID starting with a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b not found: ID does not exist" containerID="a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.799677 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b"} err="failed to get container status \"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b\": rpc error: code = NotFound desc = could not find container \"a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b\": container with ID starting with a0ee85a819b2f7c61a222bb6869bbeef8d35474fb8e1c4e86e9702d029910c4b not found: ID does not exist" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.819782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.819830 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-scripts\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.819930 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6np2\" (UniqueName: \"kubernetes.io/projected/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-kube-api-access-d6np2\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " 
pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.819981 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.820131 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.820251 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.925321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6np2\" (UniqueName: \"kubernetes.io/projected/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-kube-api-access-d6np2\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.925405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 
19:40:01.925480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.925544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.925606 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.925626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-scripts\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.926281 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.929852 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.931237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-scripts\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.934492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.934734 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:01 crc kubenswrapper[4909]: I1002 19:40:01.945763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6np2\" (UniqueName: \"kubernetes.io/projected/1364e9a8-caf2-48b7-bf59-5405d8c7ce96-kube-api-access-d6np2\") pod \"manila-scheduler-0\" (UID: \"1364e9a8-caf2-48b7-bf59-5405d8c7ce96\") " pod="openstack/manila-scheduler-0" Oct 02 19:40:02 crc kubenswrapper[4909]: I1002 19:40:02.124534 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 19:40:02 crc kubenswrapper[4909]: I1002 19:40:02.701609 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e3dec40-d45b-44d5-858e-72e56c62dfed","Type":"ContainerStarted","Data":"90980e8d9cbf47b68d4fa1a83a546e58220806aab9b9a42cdb4abe67634ad228"} Oct 02 19:40:02 crc kubenswrapper[4909]: I1002 19:40:02.777548 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 19:40:02 crc kubenswrapper[4909]: W1002 19:40:02.782448 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1364e9a8_caf2_48b7_bf59_5405d8c7ce96.slice/crio-11b746a915f71dd84a8c78b30c3cd6f912739439989cffa07fd662cfb6e0a449 WatchSource:0}: Error finding container 11b746a915f71dd84a8c78b30c3cd6f912739439989cffa07fd662cfb6e0a449: Status 404 returned error can't find the container with id 11b746a915f71dd84a8c78b30c3cd6f912739439989cffa07fd662cfb6e0a449 Oct 02 19:40:02 crc kubenswrapper[4909]: I1002 19:40:02.797996 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 02 19:40:03 crc kubenswrapper[4909]: I1002 19:40:03.628755 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b87fa96-97ae-4b5a-aba6-bb8470cc12bb" path="/var/lib/kubelet/pods/1b87fa96-97ae-4b5a-aba6-bb8470cc12bb/volumes" Oct 02 19:40:03 crc kubenswrapper[4909]: I1002 19:40:03.737701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1364e9a8-caf2-48b7-bf59-5405d8c7ce96","Type":"ContainerStarted","Data":"7af6658618f136255ac0bef4b43e946d04389d5e4767c18fbb2f093b073cdda2"} Oct 02 19:40:03 crc kubenswrapper[4909]: I1002 19:40:03.737994 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"1364e9a8-caf2-48b7-bf59-5405d8c7ce96","Type":"ContainerStarted","Data":"11b746a915f71dd84a8c78b30c3cd6f912739439989cffa07fd662cfb6e0a449"} Oct 02 19:40:04 crc kubenswrapper[4909]: I1002 19:40:04.749709 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1364e9a8-caf2-48b7-bf59-5405d8c7ce96","Type":"ContainerStarted","Data":"0d99746884d34afb055b7868063e2490736f30b7ab15902c74bccb189324b1ca"} Oct 02 19:40:04 crc kubenswrapper[4909]: I1002 19:40:04.753269 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e3dec40-d45b-44d5-858e-72e56c62dfed","Type":"ContainerStarted","Data":"cca49e2332235cbbb625bef03bd8a81f73c716ab7b89295db9206c2e80fe156e"} Oct 02 19:40:04 crc kubenswrapper[4909]: I1002 19:40:04.753478 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 19:40:04 crc kubenswrapper[4909]: I1002 19:40:04.778010 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.777980845 podStartE2EDuration="3.777980845s" podCreationTimestamp="2025-10-02 19:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 19:40:04.775771196 +0000 UTC m=+4925.963267065" watchObservedRunningTime="2025-10-02 19:40:04.777980845 +0000 UTC m=+4925.965476744" Oct 02 19:40:04 crc kubenswrapper[4909]: I1002 19:40:04.817829 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.221266048 podStartE2EDuration="6.817798566s" podCreationTimestamp="2025-10-02 19:39:58 +0000 UTC" firstStartedPulling="2025-10-02 19:39:59.877911002 +0000 UTC m=+4921.065406861" lastFinishedPulling="2025-10-02 19:40:03.47444352 +0000 UTC m=+4924.661939379" observedRunningTime="2025-10-02 19:40:04.801003402 +0000 UTC 
m=+4925.988499301" watchObservedRunningTime="2025-10-02 19:40:04.817798566 +0000 UTC m=+4926.005294465" Oct 02 19:40:09 crc kubenswrapper[4909]: I1002 19:40:09.377718 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 02 19:40:12 crc kubenswrapper[4909]: I1002 19:40:12.125498 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 02 19:40:20 crc kubenswrapper[4909]: I1002 19:40:20.817887 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 02 19:40:23 crc kubenswrapper[4909]: I1002 19:40:23.780308 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 02 19:40:29 crc kubenswrapper[4909]: I1002 19:40:29.332107 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.763189 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.768287 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.783008 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.830582 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.830763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.830860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8spcw\" (UniqueName: \"kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.933427 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.933511 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8spcw\" (UniqueName: \"kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.933731 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.934084 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.934154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:38 crc kubenswrapper[4909]: I1002 19:40:38.952975 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8spcw\" (UniqueName: \"kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw\") pod \"redhat-operators-rkzsx\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:39 crc kubenswrapper[4909]: I1002 19:40:39.096407 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:39 crc kubenswrapper[4909]: I1002 19:40:39.629763 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:40:40 crc kubenswrapper[4909]: I1002 19:40:40.205366 4909 generic.go:334] "Generic (PLEG): container finished" podID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerID="01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25" exitCode=0 Oct 02 19:40:40 crc kubenswrapper[4909]: I1002 19:40:40.205708 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerDied","Data":"01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25"} Oct 02 19:40:40 crc kubenswrapper[4909]: I1002 19:40:40.205777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerStarted","Data":"dc858832ad5dcc5cac35369aba40ca8adc0d2c9ba96a8d74992bb7758a97e4e3"} Oct 02 19:40:42 crc kubenswrapper[4909]: I1002 19:40:42.236169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerStarted","Data":"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17"} Oct 02 19:40:47 crc kubenswrapper[4909]: I1002 19:40:47.318427 4909 generic.go:334] "Generic (PLEG): container finished" podID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerID="54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17" exitCode=0 Oct 02 19:40:47 crc kubenswrapper[4909]: I1002 19:40:47.318553 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" 
event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerDied","Data":"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17"} Oct 02 19:40:48 crc kubenswrapper[4909]: I1002 19:40:48.349056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerStarted","Data":"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10"} Oct 02 19:40:48 crc kubenswrapper[4909]: I1002 19:40:48.369438 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rkzsx" podStartSLOduration=2.84270831 podStartE2EDuration="10.369414814s" podCreationTimestamp="2025-10-02 19:40:38 +0000 UTC" firstStartedPulling="2025-10-02 19:40:40.209948029 +0000 UTC m=+4961.397443898" lastFinishedPulling="2025-10-02 19:40:47.736654533 +0000 UTC m=+4968.924150402" observedRunningTime="2025-10-02 19:40:48.367827244 +0000 UTC m=+4969.555323103" watchObservedRunningTime="2025-10-02 19:40:48.369414814 +0000 UTC m=+4969.556910713" Oct 02 19:40:49 crc kubenswrapper[4909]: I1002 19:40:49.097193 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:49 crc kubenswrapper[4909]: I1002 19:40:49.097263 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:40:50 crc kubenswrapper[4909]: I1002 19:40:50.143604 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rkzsx" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" probeResult="failure" output=< Oct 02 19:40:50 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:40:50 crc kubenswrapper[4909]: > Oct 02 19:41:00 crc kubenswrapper[4909]: I1002 19:41:00.150212 4909 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-rkzsx" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" probeResult="failure" output=< Oct 02 19:41:00 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:41:00 crc kubenswrapper[4909]: > Oct 02 19:41:09 crc kubenswrapper[4909]: I1002 19:41:09.164180 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:41:09 crc kubenswrapper[4909]: I1002 19:41:09.228829 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:41:09 crc kubenswrapper[4909]: I1002 19:41:09.946248 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:41:10 crc kubenswrapper[4909]: I1002 19:41:10.616453 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rkzsx" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" containerID="cri-o://43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10" gracePeriod=2 Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.124136 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.192686 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content\") pod \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.192949 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8spcw\" (UniqueName: \"kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw\") pod \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.192978 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities\") pod \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\" (UID: \"5da06af9-02d2-48cd-ad57-0e34e10f7f73\") " Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.194081 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities" (OuterVolumeSpecName: "utilities") pod "5da06af9-02d2-48cd-ad57-0e34e10f7f73" (UID: "5da06af9-02d2-48cd-ad57-0e34e10f7f73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.199516 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw" (OuterVolumeSpecName: "kube-api-access-8spcw") pod "5da06af9-02d2-48cd-ad57-0e34e10f7f73" (UID: "5da06af9-02d2-48cd-ad57-0e34e10f7f73"). InnerVolumeSpecName "kube-api-access-8spcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.294243 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5da06af9-02d2-48cd-ad57-0e34e10f7f73" (UID: "5da06af9-02d2-48cd-ad57-0e34e10f7f73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.295965 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.295989 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8spcw\" (UniqueName: \"kubernetes.io/projected/5da06af9-02d2-48cd-ad57-0e34e10f7f73-kube-api-access-8spcw\") on node \"crc\" DevicePath \"\"" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.295999 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da06af9-02d2-48cd-ad57-0e34e10f7f73-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.631213 4909 generic.go:334] "Generic (PLEG): container finished" podID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerID="43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10" exitCode=0 Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.631538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerDied","Data":"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10"} Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.631564 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-rkzsx" event={"ID":"5da06af9-02d2-48cd-ad57-0e34e10f7f73","Type":"ContainerDied","Data":"dc858832ad5dcc5cac35369aba40ca8adc0d2c9ba96a8d74992bb7758a97e4e3"} Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.631580 4909 scope.go:117] "RemoveContainer" containerID="43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.631699 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rkzsx" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.667462 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.679570 4909 scope.go:117] "RemoveContainer" containerID="54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.682473 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rkzsx"] Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.714071 4909 scope.go:117] "RemoveContainer" containerID="01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.768550 4909 scope.go:117] "RemoveContainer" containerID="43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10" Oct 02 19:41:11 crc kubenswrapper[4909]: E1002 19:41:11.768883 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10\": container with ID starting with 43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10 not found: ID does not exist" containerID="43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.768918 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10"} err="failed to get container status \"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10\": rpc error: code = NotFound desc = could not find container \"43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10\": container with ID starting with 43e5982067b89aabd80cfdc9b4d34e6cd2eb93b1cfd8e770ccf14262c090af10 not found: ID does not exist" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.768939 4909 scope.go:117] "RemoveContainer" containerID="54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17" Oct 02 19:41:11 crc kubenswrapper[4909]: E1002 19:41:11.769299 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17\": container with ID starting with 54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17 not found: ID does not exist" containerID="54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.769339 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17"} err="failed to get container status \"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17\": rpc error: code = NotFound desc = could not find container \"54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17\": container with ID starting with 54f88803fb8877bdd5b945d7a72334825b0f2d6ae642a074474029aae7517d17 not found: ID does not exist" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.769384 4909 scope.go:117] "RemoveContainer" containerID="01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25" Oct 02 19:41:11 crc kubenswrapper[4909]: E1002 
19:41:11.769650 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25\": container with ID starting with 01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25 not found: ID does not exist" containerID="01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25" Oct 02 19:41:11 crc kubenswrapper[4909]: I1002 19:41:11.769681 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25"} err="failed to get container status \"01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25\": rpc error: code = NotFound desc = could not find container \"01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25\": container with ID starting with 01674355c5c17fb104ae239abb517d5805c0761e5f73599e1bc8f2256adaee25 not found: ID does not exist" Oct 02 19:41:13 crc kubenswrapper[4909]: I1002 19:41:13.622881 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" path="/var/lib/kubelet/pods/5da06af9-02d2-48cd-ad57-0e34e10f7f73/volumes" Oct 02 19:41:33 crc kubenswrapper[4909]: E1002 19:41:33.419808 4909 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.129:50616->38.102.83.129:45429: read tcp 38.102.83.129:50616->38.102.83.129:45429: read: connection reset by peer Oct 02 19:41:33 crc kubenswrapper[4909]: E1002 19:41:33.419873 4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:50616->38.102.83.129:45429: write tcp 38.102.83.129:50616->38.102.83.129:45429: write: broken pipe Oct 02 19:41:53 crc kubenswrapper[4909]: I1002 19:41:53.054749 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:41:53 crc kubenswrapper[4909]: I1002 19:41:53.056433 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:42:23 crc kubenswrapper[4909]: I1002 19:42:23.054560 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:42:23 crc kubenswrapper[4909]: I1002 19:42:23.055098 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.054884 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.055566 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.055612 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.056600 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.056702 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" gracePeriod=600 Oct 02 19:42:53 crc kubenswrapper[4909]: E1002 19:42:53.177636 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.836843 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" exitCode=0 Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.836945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40"} Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.837339 4909 scope.go:117] "RemoveContainer" containerID="5616c14478f166c035c2131cff63074d735206236b9d9a783fc76cd787dbc5ac" Oct 02 19:42:53 crc kubenswrapper[4909]: I1002 19:42:53.838474 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:42:53 crc kubenswrapper[4909]: E1002 19:42:53.839258 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:43:09 crc kubenswrapper[4909]: I1002 19:43:09.623554 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:43:09 crc kubenswrapper[4909]: E1002 19:43:09.625488 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:43:22 crc kubenswrapper[4909]: I1002 19:43:22.609315 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:43:22 crc kubenswrapper[4909]: E1002 19:43:22.610864 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:43:36 crc kubenswrapper[4909]: I1002 19:43:36.609413 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:43:36 crc kubenswrapper[4909]: E1002 19:43:36.610317 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:43:51 crc kubenswrapper[4909]: I1002 19:43:51.611303 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:43:51 crc kubenswrapper[4909]: E1002 19:43:51.612326 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:44:03 crc kubenswrapper[4909]: I1002 19:44:03.613358 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:44:03 crc kubenswrapper[4909]: E1002 
19:44:03.614280 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:44:16 crc kubenswrapper[4909]: I1002 19:44:16.608386 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:44:16 crc kubenswrapper[4909]: E1002 19:44:16.609671 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:44:29 crc kubenswrapper[4909]: I1002 19:44:29.621528 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:44:29 crc kubenswrapper[4909]: E1002 19:44:29.623934 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:44:43 crc kubenswrapper[4909]: I1002 19:44:43.610014 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:44:43 crc 
kubenswrapper[4909]: E1002 19:44:43.610879 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:44:56 crc kubenswrapper[4909]: I1002 19:44:56.609395 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:44:56 crc kubenswrapper[4909]: E1002 19:44:56.610304 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.146564 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g"] Oct 02 19:45:00 crc kubenswrapper[4909]: E1002 19:45:00.147628 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="extract-utilities" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.147644 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="extract-utilities" Oct 02 19:45:00 crc kubenswrapper[4909]: E1002 19:45:00.147676 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 
19:45:00.147685 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" Oct 02 19:45:00 crc kubenswrapper[4909]: E1002 19:45:00.147708 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="extract-content" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.147717 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="extract-content" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.147988 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da06af9-02d2-48cd-ad57-0e34e10f7f73" containerName="registry-server" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.148915 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.151686 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.151753 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.160966 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g"] Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.313532 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbj2c\" (UniqueName: \"kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.313683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.313724 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.416790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbj2c\" (UniqueName: \"kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.417179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.417841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.418538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.442870 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbj2c\" (UniqueName: \"kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.442982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume\") pod \"collect-profiles-29323905-rth9g\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.509334 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:00 crc kubenswrapper[4909]: I1002 19:45:00.981893 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g"] Oct 02 19:45:00 crc kubenswrapper[4909]: W1002 19:45:00.988449 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb10e292_a690_4516_928f_d6d18d85426b.slice/crio-cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c WatchSource:0}: Error finding container cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c: Status 404 returned error can't find the container with id cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c Oct 02 19:45:01 crc kubenswrapper[4909]: I1002 19:45:01.313892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" event={"ID":"eb10e292-a690-4516-928f-d6d18d85426b","Type":"ContainerStarted","Data":"6eb9f48fb84bef14011a8f04a225c0fdc2923990f0517e858d21a2afea0c9622"} Oct 02 19:45:01 crc kubenswrapper[4909]: I1002 19:45:01.314394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" event={"ID":"eb10e292-a690-4516-928f-d6d18d85426b","Type":"ContainerStarted","Data":"cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c"} Oct 02 19:45:01 crc kubenswrapper[4909]: I1002 19:45:01.344670 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" podStartSLOduration=1.344648332 podStartE2EDuration="1.344648332s" podCreationTimestamp="2025-10-02 19:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
19:45:01.332143252 +0000 UTC m=+5222.519639141" watchObservedRunningTime="2025-10-02 19:45:01.344648332 +0000 UTC m=+5222.532144201" Oct 02 19:45:03 crc kubenswrapper[4909]: I1002 19:45:03.339220 4909 generic.go:334] "Generic (PLEG): container finished" podID="eb10e292-a690-4516-928f-d6d18d85426b" containerID="6eb9f48fb84bef14011a8f04a225c0fdc2923990f0517e858d21a2afea0c9622" exitCode=0 Oct 02 19:45:03 crc kubenswrapper[4909]: I1002 19:45:03.339695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" event={"ID":"eb10e292-a690-4516-928f-d6d18d85426b","Type":"ContainerDied","Data":"6eb9f48fb84bef14011a8f04a225c0fdc2923990f0517e858d21a2afea0c9622"} Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.823758 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.933214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbj2c\" (UniqueName: \"kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c\") pod \"eb10e292-a690-4516-928f-d6d18d85426b\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.933750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume\") pod \"eb10e292-a690-4516-928f-d6d18d85426b\" (UID: \"eb10e292-a690-4516-928f-d6d18d85426b\") " Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.933981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume\") pod \"eb10e292-a690-4516-928f-d6d18d85426b\" (UID: 
\"eb10e292-a690-4516-928f-d6d18d85426b\") " Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.934311 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb10e292-a690-4516-928f-d6d18d85426b" (UID: "eb10e292-a690-4516-928f-d6d18d85426b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.934884 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb10e292-a690-4516-928f-d6d18d85426b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.954822 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb10e292-a690-4516-928f-d6d18d85426b" (UID: "eb10e292-a690-4516-928f-d6d18d85426b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 19:45:04 crc kubenswrapper[4909]: I1002 19:45:04.954931 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c" (OuterVolumeSpecName: "kube-api-access-fbj2c") pod "eb10e292-a690-4516-928f-d6d18d85426b" (UID: "eb10e292-a690-4516-928f-d6d18d85426b"). InnerVolumeSpecName "kube-api-access-fbj2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.036973 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb10e292-a690-4516-928f-d6d18d85426b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.037251 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbj2c\" (UniqueName: \"kubernetes.io/projected/eb10e292-a690-4516-928f-d6d18d85426b-kube-api-access-fbj2c\") on node \"crc\" DevicePath \"\"" Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.364078 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" event={"ID":"eb10e292-a690-4516-928f-d6d18d85426b","Type":"ContainerDied","Data":"cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c"} Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.364479 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef4488c6bc8dda3fc0ec2921e502fc1fd16d84362117088d5ea4ac989888b6c" Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.364124 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g" Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.447386 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr"] Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.461964 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323860-qrpjr"] Oct 02 19:45:05 crc kubenswrapper[4909]: I1002 19:45:05.624054 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4cecbb-b3da-484e-be90-00ecbf10449f" path="/var/lib/kubelet/pods/8f4cecbb-b3da-484e-be90-00ecbf10449f/volumes" Oct 02 19:45:07 crc kubenswrapper[4909]: I1002 19:45:07.608818 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:45:07 crc kubenswrapper[4909]: E1002 19:45:07.610284 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:20 crc kubenswrapper[4909]: I1002 19:45:20.615762 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:45:20 crc kubenswrapper[4909]: E1002 19:45:20.616748 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:26 crc kubenswrapper[4909]: I1002 19:45:26.293387 4909 scope.go:117] "RemoveContainer" containerID="c63922d5348508f261363f90e12cef7efc5f9dad1277b07f4660007cb35a2be3" Oct 02 19:45:34 crc kubenswrapper[4909]: I1002 19:45:34.609143 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:45:34 crc kubenswrapper[4909]: E1002 19:45:34.609990 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:45 crc kubenswrapper[4909]: I1002 19:45:45.609238 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:45:45 crc kubenswrapper[4909]: E1002 19:45:45.610353 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:56 crc kubenswrapper[4909]: I1002 19:45:56.875291 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:45:56 crc kubenswrapper[4909]: E1002 19:45:56.876613 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb10e292-a690-4516-928f-d6d18d85426b" 
containerName="collect-profiles" Oct 02 19:45:56 crc kubenswrapper[4909]: I1002 19:45:56.876636 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb10e292-a690-4516-928f-d6d18d85426b" containerName="collect-profiles" Oct 02 19:45:56 crc kubenswrapper[4909]: I1002 19:45:56.876930 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb10e292-a690-4516-928f-d6d18d85426b" containerName="collect-profiles" Oct 02 19:45:56 crc kubenswrapper[4909]: I1002 19:45:56.879184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:56 crc kubenswrapper[4909]: I1002 19:45:56.893678 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.023443 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwz7b\" (UniqueName: \"kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.023837 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.023917 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " 
pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.126623 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.126856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwz7b\" (UniqueName: \"kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.126899 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.127156 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.127428 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " 
pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.146568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwz7b\" (UniqueName: \"kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b\") pod \"certified-operators-fbssz\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.212932 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.809712 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:45:57 crc kubenswrapper[4909]: I1002 19:45:57.942310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerStarted","Data":"024d3af4443cc7516dd2fe0233e4b35fab4eb809e5ae73919a10a82be7b2d15b"} Oct 02 19:45:58 crc kubenswrapper[4909]: I1002 19:45:58.608223 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:45:58 crc kubenswrapper[4909]: E1002 19:45:58.608766 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:45:58 crc kubenswrapper[4909]: I1002 19:45:58.954547 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf4ae9ef-f515-462f-aed5-75d51be76c59" 
containerID="1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5" exitCode=0 Oct 02 19:45:58 crc kubenswrapper[4909]: I1002 19:45:58.954602 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerDied","Data":"1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5"} Oct 02 19:45:58 crc kubenswrapper[4909]: I1002 19:45:58.956550 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:46:00 crc kubenswrapper[4909]: I1002 19:46:00.976736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerStarted","Data":"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27"} Oct 02 19:46:04 crc kubenswrapper[4909]: I1002 19:46:04.012768 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerID="e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27" exitCode=0 Oct 02 19:46:04 crc kubenswrapper[4909]: I1002 19:46:04.012857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerDied","Data":"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27"} Oct 02 19:46:05 crc kubenswrapper[4909]: I1002 19:46:05.029549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerStarted","Data":"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b"} Oct 02 19:46:05 crc kubenswrapper[4909]: I1002 19:46:05.057753 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fbssz" 
podStartSLOduration=3.6041927190000003 podStartE2EDuration="9.057727335s" podCreationTimestamp="2025-10-02 19:45:56 +0000 UTC" firstStartedPulling="2025-10-02 19:45:58.956305687 +0000 UTC m=+5280.143801546" lastFinishedPulling="2025-10-02 19:46:04.409840263 +0000 UTC m=+5285.597336162" observedRunningTime="2025-10-02 19:46:05.051473429 +0000 UTC m=+5286.238969308" watchObservedRunningTime="2025-10-02 19:46:05.057727335 +0000 UTC m=+5286.245223204" Oct 02 19:46:07 crc kubenswrapper[4909]: I1002 19:46:07.213438 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:07 crc kubenswrapper[4909]: I1002 19:46:07.215088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:07 crc kubenswrapper[4909]: I1002 19:46:07.262703 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:09 crc kubenswrapper[4909]: I1002 19:46:09.155771 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:09 crc kubenswrapper[4909]: I1002 19:46:09.220165 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:46:10 crc kubenswrapper[4909]: I1002 19:46:10.608424 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:46:10 crc kubenswrapper[4909]: E1002 19:46:10.609061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.090522 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fbssz" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="registry-server" containerID="cri-o://139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b" gracePeriod=2 Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.723141 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.812390 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities\") pod \"bf4ae9ef-f515-462f-aed5-75d51be76c59\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.812498 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwz7b\" (UniqueName: \"kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b\") pod \"bf4ae9ef-f515-462f-aed5-75d51be76c59\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.812553 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content\") pod \"bf4ae9ef-f515-462f-aed5-75d51be76c59\" (UID: \"bf4ae9ef-f515-462f-aed5-75d51be76c59\") " Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.815416 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities" (OuterVolumeSpecName: "utilities") 
pod "bf4ae9ef-f515-462f-aed5-75d51be76c59" (UID: "bf4ae9ef-f515-462f-aed5-75d51be76c59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.819183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b" (OuterVolumeSpecName: "kube-api-access-wwz7b") pod "bf4ae9ef-f515-462f-aed5-75d51be76c59" (UID: "bf4ae9ef-f515-462f-aed5-75d51be76c59"). InnerVolumeSpecName "kube-api-access-wwz7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.867477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf4ae9ef-f515-462f-aed5-75d51be76c59" (UID: "bf4ae9ef-f515-462f-aed5-75d51be76c59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.914710 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.914742 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwz7b\" (UniqueName: \"kubernetes.io/projected/bf4ae9ef-f515-462f-aed5-75d51be76c59-kube-api-access-wwz7b\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:11 crc kubenswrapper[4909]: I1002 19:46:11.914752 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4ae9ef-f515-462f-aed5-75d51be76c59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.114251 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerID="139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b" exitCode=0 Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.114299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerDied","Data":"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b"} Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.114317 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fbssz" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.114328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fbssz" event={"ID":"bf4ae9ef-f515-462f-aed5-75d51be76c59","Type":"ContainerDied","Data":"024d3af4443cc7516dd2fe0233e4b35fab4eb809e5ae73919a10a82be7b2d15b"} Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.114353 4909 scope.go:117] "RemoveContainer" containerID="139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.149948 4909 scope.go:117] "RemoveContainer" containerID="e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.153528 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.168531 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fbssz"] Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.715342 4909 scope.go:117] "RemoveContainer" containerID="1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.787811 4909 scope.go:117] "RemoveContainer" containerID="139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b" Oct 02 19:46:12 crc kubenswrapper[4909]: E1002 19:46:12.788336 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b\": container with ID starting with 139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b not found: ID does not exist" containerID="139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.788383 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b"} err="failed to get container status \"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b\": rpc error: code = NotFound desc = could not find container \"139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b\": container with ID starting with 139d5d35f885081c7e7231198aa06a70f22586994e8852f201516b1faa628b9b not found: ID does not exist" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.788417 4909 scope.go:117] "RemoveContainer" containerID="e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27" Oct 02 19:46:12 crc kubenswrapper[4909]: E1002 19:46:12.788779 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27\": container with ID starting with e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27 not found: ID does not exist" containerID="e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.788812 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27"} err="failed to get container status \"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27\": rpc error: code = NotFound desc = could not find container \"e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27\": container with ID starting with e4ded8a3a9d69721e404598258d046be8721a5fb2b387d407d4b1dd6b5969c27 not found: ID does not exist" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.788833 4909 scope.go:117] "RemoveContainer" containerID="1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5" Oct 02 19:46:12 crc kubenswrapper[4909]: E1002 
19:46:12.798782 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5\": container with ID starting with 1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5 not found: ID does not exist" containerID="1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5" Oct 02 19:46:12 crc kubenswrapper[4909]: I1002 19:46:12.798864 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5"} err="failed to get container status \"1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5\": rpc error: code = NotFound desc = could not find container \"1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5\": container with ID starting with 1c02f58e1a222ed904545d1ea1d8af2e084ce1e56792cf5daa6056b5d34e7dc5 not found: ID does not exist" Oct 02 19:46:13 crc kubenswrapper[4909]: I1002 19:46:13.620669 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" path="/var/lib/kubelet/pods/bf4ae9ef-f515-462f-aed5-75d51be76c59/volumes" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.685885 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:19 crc kubenswrapper[4909]: E1002 19:46:19.686957 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="extract-utilities" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.686970 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="extract-utilities" Oct 02 19:46:19 crc kubenswrapper[4909]: E1002 19:46:19.686986 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="registry-server" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.686992 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="registry-server" Oct 02 19:46:19 crc kubenswrapper[4909]: E1002 19:46:19.687010 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="extract-content" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.687017 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="extract-content" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.687262 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4ae9ef-f515-462f-aed5-75d51be76c59" containerName="registry-server" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.688889 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.699242 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.804696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.804775 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vhhc\" (UniqueName: \"kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc\") pod \"redhat-marketplace-qnwm9\" (UID: 
\"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.804805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.907914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.908086 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vhhc\" (UniqueName: \"kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.908128 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.908604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content\") pod \"redhat-marketplace-qnwm9\" (UID: 
\"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.908894 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:19 crc kubenswrapper[4909]: I1002 19:46:19.940563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vhhc\" (UniqueName: \"kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc\") pod \"redhat-marketplace-qnwm9\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:20 crc kubenswrapper[4909]: I1002 19:46:20.058567 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:20 crc kubenswrapper[4909]: I1002 19:46:20.554659 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:21 crc kubenswrapper[4909]: I1002 19:46:21.253595 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerID="a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda" exitCode=0 Oct 02 19:46:21 crc kubenswrapper[4909]: I1002 19:46:21.253645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerDied","Data":"a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda"} Oct 02 19:46:21 crc kubenswrapper[4909]: I1002 19:46:21.253955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" 
event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerStarted","Data":"035d41f5acae82d6882af4ea0da1b13855b8c7bc7ae29091ba3123cee835df2f"} Oct 02 19:46:22 crc kubenswrapper[4909]: I1002 19:46:22.608478 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:46:22 crc kubenswrapper[4909]: E1002 19:46:22.608823 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:46:23 crc kubenswrapper[4909]: I1002 19:46:23.285807 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerID="6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5" exitCode=0 Oct 02 19:46:23 crc kubenswrapper[4909]: I1002 19:46:23.285946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerDied","Data":"6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5"} Oct 02 19:46:24 crc kubenswrapper[4909]: I1002 19:46:24.306723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerStarted","Data":"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51"} Oct 02 19:46:24 crc kubenswrapper[4909]: I1002 19:46:24.332558 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnwm9" podStartSLOduration=2.858380434 podStartE2EDuration="5.332537991s" 
podCreationTimestamp="2025-10-02 19:46:19 +0000 UTC" firstStartedPulling="2025-10-02 19:46:21.255433653 +0000 UTC m=+5302.442929512" lastFinishedPulling="2025-10-02 19:46:23.72959119 +0000 UTC m=+5304.917087069" observedRunningTime="2025-10-02 19:46:24.327567295 +0000 UTC m=+5305.515063184" watchObservedRunningTime="2025-10-02 19:46:24.332537991 +0000 UTC m=+5305.520033860" Oct 02 19:46:30 crc kubenswrapper[4909]: I1002 19:46:30.059673 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:30 crc kubenswrapper[4909]: I1002 19:46:30.060362 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:30 crc kubenswrapper[4909]: I1002 19:46:30.113803 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:30 crc kubenswrapper[4909]: I1002 19:46:30.475312 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:30 crc kubenswrapper[4909]: I1002 19:46:30.552229 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:32 crc kubenswrapper[4909]: I1002 19:46:32.405216 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qnwm9" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="registry-server" containerID="cri-o://5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51" gracePeriod=2 Oct 02 19:46:32 crc kubenswrapper[4909]: I1002 19:46:32.947499 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.016160 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content\") pod \"3b504552-4ee1-4691-baf9-068c5ad897a6\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.016236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vhhc\" (UniqueName: \"kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc\") pod \"3b504552-4ee1-4691-baf9-068c5ad897a6\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.016365 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities\") pod \"3b504552-4ee1-4691-baf9-068c5ad897a6\" (UID: \"3b504552-4ee1-4691-baf9-068c5ad897a6\") " Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.018123 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities" (OuterVolumeSpecName: "utilities") pod "3b504552-4ee1-4691-baf9-068c5ad897a6" (UID: "3b504552-4ee1-4691-baf9-068c5ad897a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.025382 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc" (OuterVolumeSpecName: "kube-api-access-4vhhc") pod "3b504552-4ee1-4691-baf9-068c5ad897a6" (UID: "3b504552-4ee1-4691-baf9-068c5ad897a6"). InnerVolumeSpecName "kube-api-access-4vhhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.038501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b504552-4ee1-4691-baf9-068c5ad897a6" (UID: "3b504552-4ee1-4691-baf9-068c5ad897a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.118990 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.119018 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vhhc\" (UniqueName: \"kubernetes.io/projected/3b504552-4ee1-4691-baf9-068c5ad897a6-kube-api-access-4vhhc\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.119040 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b504552-4ee1-4691-baf9-068c5ad897a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.421676 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerID="5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51" exitCode=0 Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.421714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerDied","Data":"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51"} Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.421738 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qnwm9" event={"ID":"3b504552-4ee1-4691-baf9-068c5ad897a6","Type":"ContainerDied","Data":"035d41f5acae82d6882af4ea0da1b13855b8c7bc7ae29091ba3123cee835df2f"} Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.421758 4909 scope.go:117] "RemoveContainer" containerID="5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.423124 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnwm9" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.466849 4909 scope.go:117] "RemoveContainer" containerID="6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.469632 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.483365 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnwm9"] Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.501748 4909 scope.go:117] "RemoveContainer" containerID="a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.543947 4909 scope.go:117] "RemoveContainer" containerID="5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51" Oct 02 19:46:33 crc kubenswrapper[4909]: E1002 19:46:33.544895 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51\": container with ID starting with 5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51 not found: ID does not exist" containerID="5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.544929 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51"} err="failed to get container status \"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51\": rpc error: code = NotFound desc = could not find container \"5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51\": container with ID starting with 5a1efd87d00ece92d276ca161c5263ba2a2705372dd3b9af4c70884724a23f51 not found: ID does not exist" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.544951 4909 scope.go:117] "RemoveContainer" containerID="6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5" Oct 02 19:46:33 crc kubenswrapper[4909]: E1002 19:46:33.545437 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5\": container with ID starting with 6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5 not found: ID does not exist" containerID="6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.545554 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5"} err="failed to get container status \"6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5\": rpc error: code = NotFound desc = could not find container \"6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5\": container with ID starting with 6cb94ce870c07a1e8b3ada463a8796386ca4ac895578399df84f0144731abca5 not found: ID does not exist" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.545669 4909 scope.go:117] "RemoveContainer" containerID="a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda" Oct 02 19:46:33 crc kubenswrapper[4909]: E1002 
19:46:33.546264 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda\": container with ID starting with a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda not found: ID does not exist" containerID="a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.546353 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda"} err="failed to get container status \"a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda\": rpc error: code = NotFound desc = could not find container \"a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda\": container with ID starting with a794dd146767d80fba2047401a02e540119d04ab3e06bf8d6fe9405961a3afda not found: ID does not exist" Oct 02 19:46:33 crc kubenswrapper[4909]: I1002 19:46:33.621410 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" path="/var/lib/kubelet/pods/3b504552-4ee1-4691-baf9-068c5ad897a6/volumes" Oct 02 19:46:37 crc kubenswrapper[4909]: I1002 19:46:37.609463 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:46:37 crc kubenswrapper[4909]: E1002 19:46:37.610451 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:46:45 crc kubenswrapper[4909]: E1002 19:46:45.546704 
4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:49820->38.102.83.129:45429: write tcp 38.102.83.129:49820->38.102.83.129:45429: write: connection reset by peer Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.898508 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:46:45 crc kubenswrapper[4909]: E1002 19:46:45.899753 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="registry-server" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.899797 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="registry-server" Oct 02 19:46:45 crc kubenswrapper[4909]: E1002 19:46:45.899826 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="extract-utilities" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.899873 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="extract-utilities" Oct 02 19:46:45 crc kubenswrapper[4909]: E1002 19:46:45.899909 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="extract-content" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.899922 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="extract-content" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.900321 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b504552-4ee1-4691-baf9-068c5ad897a6" containerName="registry-server" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.903247 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.913382 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.916382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvh2n\" (UniqueName: \"kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.916542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:45 crc kubenswrapper[4909]: I1002 19:46:45.916942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.019559 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.019678 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lvh2n\" (UniqueName: \"kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.019739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.020060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.020258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.046402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvh2n\" (UniqueName: \"kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n\") pod \"community-operators-s826g\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.232593 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:46 crc kubenswrapper[4909]: I1002 19:46:46.824932 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:46:47 crc kubenswrapper[4909]: I1002 19:46:47.607147 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb575cba-175c-4853-9541-73906a53af37" containerID="91d5f34b44d53b27bc44776b34b95ab6e8aae0c854f626f98db2cc414ed4bf2a" exitCode=0 Oct 02 19:46:47 crc kubenswrapper[4909]: I1002 19:46:47.607200 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerDied","Data":"91d5f34b44d53b27bc44776b34b95ab6e8aae0c854f626f98db2cc414ed4bf2a"} Oct 02 19:46:47 crc kubenswrapper[4909]: I1002 19:46:47.645511 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerStarted","Data":"d3c6ed9a586d07df940661b018a4d312c84b392fcd8faadf803c7ed78aba8fe8"} Oct 02 19:46:49 crc kubenswrapper[4909]: I1002 19:46:49.625348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerStarted","Data":"067c523562dd3a6fd3489f27b7b6ee54baa6e824c4baaf9ff502d20bdcb45d41"} Oct 02 19:46:51 crc kubenswrapper[4909]: I1002 19:46:51.674809 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb575cba-175c-4853-9541-73906a53af37" containerID="067c523562dd3a6fd3489f27b7b6ee54baa6e824c4baaf9ff502d20bdcb45d41" exitCode=0 Oct 02 19:46:51 crc kubenswrapper[4909]: I1002 19:46:51.674879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" 
event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerDied","Data":"067c523562dd3a6fd3489f27b7b6ee54baa6e824c4baaf9ff502d20bdcb45d41"} Oct 02 19:46:52 crc kubenswrapper[4909]: I1002 19:46:52.610316 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:46:52 crc kubenswrapper[4909]: E1002 19:46:52.611389 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:46:52 crc kubenswrapper[4909]: I1002 19:46:52.690266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerStarted","Data":"5a08edb0180328cbce1a6fd7a4ac749dd42a9b883412c1cdce04ed6489935760"} Oct 02 19:46:52 crc kubenswrapper[4909]: I1002 19:46:52.728252 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s826g" podStartSLOduration=3.234512726 podStartE2EDuration="7.728220324s" podCreationTimestamp="2025-10-02 19:46:45 +0000 UTC" firstStartedPulling="2025-10-02 19:46:47.609181507 +0000 UTC m=+5328.796677366" lastFinishedPulling="2025-10-02 19:46:52.102889105 +0000 UTC m=+5333.290384964" observedRunningTime="2025-10-02 19:46:52.714738023 +0000 UTC m=+5333.902233912" watchObservedRunningTime="2025-10-02 19:46:52.728220324 +0000 UTC m=+5333.915716193" Oct 02 19:46:56 crc kubenswrapper[4909]: I1002 19:46:56.233077 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:56 crc 
kubenswrapper[4909]: I1002 19:46:56.235912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:56 crc kubenswrapper[4909]: I1002 19:46:56.301963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:57 crc kubenswrapper[4909]: I1002 19:46:57.810677 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:46:57 crc kubenswrapper[4909]: I1002 19:46:57.870818 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:46:59 crc kubenswrapper[4909]: I1002 19:46:59.767665 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s826g" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="registry-server" containerID="cri-o://5a08edb0180328cbce1a6fd7a4ac749dd42a9b883412c1cdce04ed6489935760" gracePeriod=2 Oct 02 19:47:00 crc kubenswrapper[4909]: I1002 19:47:00.781472 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb575cba-175c-4853-9541-73906a53af37" containerID="5a08edb0180328cbce1a6fd7a4ac749dd42a9b883412c1cdce04ed6489935760" exitCode=0 Oct 02 19:47:00 crc kubenswrapper[4909]: I1002 19:47:00.781577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerDied","Data":"5a08edb0180328cbce1a6fd7a4ac749dd42a9b883412c1cdce04ed6489935760"} Oct 02 19:47:00 crc kubenswrapper[4909]: I1002 19:47:00.781955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s826g" event={"ID":"bb575cba-175c-4853-9541-73906a53af37","Type":"ContainerDied","Data":"d3c6ed9a586d07df940661b018a4d312c84b392fcd8faadf803c7ed78aba8fe8"} Oct 02 
19:47:00 crc kubenswrapper[4909]: I1002 19:47:00.781977 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c6ed9a586d07df940661b018a4d312c84b392fcd8faadf803c7ed78aba8fe8" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.025681 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.092630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content\") pod \"bb575cba-175c-4853-9541-73906a53af37\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.092723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvh2n\" (UniqueName: \"kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n\") pod \"bb575cba-175c-4853-9541-73906a53af37\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.093117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities\") pod \"bb575cba-175c-4853-9541-73906a53af37\" (UID: \"bb575cba-175c-4853-9541-73906a53af37\") " Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.096805 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities" (OuterVolumeSpecName: "utilities") pod "bb575cba-175c-4853-9541-73906a53af37" (UID: "bb575cba-175c-4853-9541-73906a53af37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.104796 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n" (OuterVolumeSpecName: "kube-api-access-lvh2n") pod "bb575cba-175c-4853-9541-73906a53af37" (UID: "bb575cba-175c-4853-9541-73906a53af37"). InnerVolumeSpecName "kube-api-access-lvh2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.153186 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb575cba-175c-4853-9541-73906a53af37" (UID: "bb575cba-175c-4853-9541-73906a53af37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.195791 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.195828 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb575cba-175c-4853-9541-73906a53af37-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.195844 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvh2n\" (UniqueName: \"kubernetes.io/projected/bb575cba-175c-4853-9541-73906a53af37-kube-api-access-lvh2n\") on node \"crc\" DevicePath \"\"" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.795343 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s826g" Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.832304 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:47:01 crc kubenswrapper[4909]: I1002 19:47:01.852482 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s826g"] Oct 02 19:47:03 crc kubenswrapper[4909]: I1002 19:47:03.609280 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:47:03 crc kubenswrapper[4909]: E1002 19:47:03.609902 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:47:03 crc kubenswrapper[4909]: I1002 19:47:03.625601 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb575cba-175c-4853-9541-73906a53af37" path="/var/lib/kubelet/pods/bb575cba-175c-4853-9541-73906a53af37/volumes" Oct 02 19:47:16 crc kubenswrapper[4909]: I1002 19:47:16.608569 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:47:16 crc kubenswrapper[4909]: E1002 19:47:16.609578 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:47:30 crc kubenswrapper[4909]: I1002 19:47:30.609007 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:47:30 crc kubenswrapper[4909]: E1002 19:47:30.610470 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:47:44 crc kubenswrapper[4909]: I1002 19:47:44.609166 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:47:44 crc kubenswrapper[4909]: E1002 19:47:44.610180 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:47:58 crc kubenswrapper[4909]: I1002 19:47:58.609010 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:47:59 crc kubenswrapper[4909]: I1002 19:47:59.484970 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729"} Oct 02 19:48:55 crc kubenswrapper[4909]: I1002 19:48:55.067579 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-szqm7"] Oct 02 19:48:55 crc kubenswrapper[4909]: I1002 19:48:55.078387 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-szqm7"] Oct 02 19:48:55 crc kubenswrapper[4909]: I1002 19:48:55.625226 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8700ec28-b969-413d-8c1b-40f72280eea2" path="/var/lib/kubelet/pods/8700ec28-b969-413d-8c1b-40f72280eea2/volumes" Oct 02 19:49:14 crc kubenswrapper[4909]: I1002 19:49:14.035114 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-bdaa-account-create-klvbh"] Oct 02 19:49:14 crc kubenswrapper[4909]: I1002 19:49:14.047524 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-bdaa-account-create-klvbh"] Oct 02 19:49:15 crc kubenswrapper[4909]: I1002 19:49:15.633680 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525e15bf-102a-4a3d-b7e1-4184a27bec88" path="/var/lib/kubelet/pods/525e15bf-102a-4a3d-b7e1-4184a27bec88/volumes" Oct 02 19:49:26 crc kubenswrapper[4909]: I1002 19:49:26.527834 4909 scope.go:117] "RemoveContainer" containerID="8d06acbf2ef8582f76d415d0a8d9b9e002819efa143f4d392d6744cfe968c910" Oct 02 19:49:26 crc kubenswrapper[4909]: I1002 19:49:26.586506 4909 scope.go:117] "RemoveContainer" containerID="1ed545246c69b8ef10256c2b695a9de8a7249188ee8027f591a4855311515190" Oct 02 19:49:34 crc kubenswrapper[4909]: I1002 19:49:34.043717 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-nfhmm"] Oct 02 19:49:34 crc kubenswrapper[4909]: I1002 19:49:34.056503 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-nfhmm"] Oct 02 19:49:35 crc kubenswrapper[4909]: I1002 19:49:35.620935 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f857eeb-83ae-4626-ae42-a4d52389bf79" path="/var/lib/kubelet/pods/3f857eeb-83ae-4626-ae42-a4d52389bf79/volumes" Oct 02 
19:50:23 crc kubenswrapper[4909]: I1002 19:50:23.054164 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:50:23 crc kubenswrapper[4909]: I1002 19:50:23.054855 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:50:26 crc kubenswrapper[4909]: I1002 19:50:26.716454 4909 scope.go:117] "RemoveContainer" containerID="d5f9b9dc3717da5f5e77da952a5f53011afcc50d33014b84cdd4e0980ed90f22" Oct 02 19:50:53 crc kubenswrapper[4909]: I1002 19:50:53.055622 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 19:50:53 crc kubenswrapper[4909]: I1002 19:50:53.056388 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.054196 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.054726 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.054776 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.055659 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.055725 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729" gracePeriod=600 Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.158004 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"] Oct 02 19:51:23 crc kubenswrapper[4909]: E1002 19:51:23.158548 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="extract-utilities" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.158568 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb575cba-175c-4853-9541-73906a53af37" 
containerName="extract-utilities" Oct 02 19:51:23 crc kubenswrapper[4909]: E1002 19:51:23.158619 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="extract-content" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.158628 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="extract-content" Oct 02 19:51:23 crc kubenswrapper[4909]: E1002 19:51:23.158664 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="registry-server" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.158674 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="registry-server" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.158922 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb575cba-175c-4853-9541-73906a53af37" containerName="registry-server" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.161055 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.172928 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"] Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.182314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvjst\" (UniqueName: \"kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.182690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.183007 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.286084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.286211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rvjst\" (UniqueName: \"kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.286368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.287051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.287349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.311527 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvjst\" (UniqueName: \"kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst\") pod \"redhat-operators-z9h4f\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") " pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.575645 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.897282 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729" exitCode=0 Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.897362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729"} Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.897588 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"} Oct 02 19:51:23 crc kubenswrapper[4909]: I1002 19:51:23.897610 4909 scope.go:117] "RemoveContainer" containerID="bec1adbb1ee92cb73743e4f07eddf137579d12e1e405b27bb70af248d1a42d40" Oct 02 19:51:24 crc kubenswrapper[4909]: I1002 19:51:24.077727 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"] Oct 02 19:51:24 crc kubenswrapper[4909]: W1002 19:51:24.080147 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e5be76_b628_4d8b_9362_5c15fb3c16f1.slice/crio-0997c6d70dccdde6008acdb9ff8a89fd26065e0e5487d020416fee164067a608 WatchSource:0}: Error finding container 0997c6d70dccdde6008acdb9ff8a89fd26065e0e5487d020416fee164067a608: Status 404 returned error can't find the container with id 0997c6d70dccdde6008acdb9ff8a89fd26065e0e5487d020416fee164067a608 Oct 02 19:51:24 crc kubenswrapper[4909]: I1002 19:51:24.922087 4909 generic.go:334] "Generic (PLEG): 
container finished" podID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerID="b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3" exitCode=0 Oct 02 19:51:24 crc kubenswrapper[4909]: I1002 19:51:24.922597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerDied","Data":"b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3"} Oct 02 19:51:24 crc kubenswrapper[4909]: I1002 19:51:24.922634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerStarted","Data":"0997c6d70dccdde6008acdb9ff8a89fd26065e0e5487d020416fee164067a608"} Oct 02 19:51:24 crc kubenswrapper[4909]: I1002 19:51:24.926852 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 19:51:26 crc kubenswrapper[4909]: I1002 19:51:26.946378 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerStarted","Data":"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"} Oct 02 19:51:30 crc kubenswrapper[4909]: I1002 19:51:30.991458 4909 generic.go:334] "Generic (PLEG): container finished" podID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerID="13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2" exitCode=0 Oct 02 19:51:30 crc kubenswrapper[4909]: I1002 19:51:30.991994 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerDied","Data":"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"} Oct 02 19:51:32 crc kubenswrapper[4909]: I1002 19:51:32.003741 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerStarted","Data":"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"} Oct 02 19:51:32 crc kubenswrapper[4909]: I1002 19:51:32.027539 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z9h4f" podStartSLOduration=2.51366053 podStartE2EDuration="9.027523387s" podCreationTimestamp="2025-10-02 19:51:23 +0000 UTC" firstStartedPulling="2025-10-02 19:51:24.926508892 +0000 UTC m=+5606.114004761" lastFinishedPulling="2025-10-02 19:51:31.440371759 +0000 UTC m=+5612.627867618" observedRunningTime="2025-10-02 19:51:32.023650437 +0000 UTC m=+5613.211146296" watchObservedRunningTime="2025-10-02 19:51:32.027523387 +0000 UTC m=+5613.215019246" Oct 02 19:51:33 crc kubenswrapper[4909]: I1002 19:51:33.576474 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:33 crc kubenswrapper[4909]: I1002 19:51:33.577190 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:34 crc kubenswrapper[4909]: I1002 19:51:34.655403 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z9h4f" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="registry-server" probeResult="failure" output=< Oct 02 19:51:34 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 19:51:34 crc kubenswrapper[4909]: > Oct 02 19:51:43 crc kubenswrapper[4909]: I1002 19:51:43.643764 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z9h4f" Oct 02 19:51:43 crc kubenswrapper[4909]: I1002 19:51:43.704628 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z9h4f" 
Oct 02 19:51:43 crc kubenswrapper[4909]: I1002 19:51:43.892116 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"]
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.143364 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z9h4f" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="registry-server" containerID="cri-o://21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93" gracePeriod=2
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.781189 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z9h4f"
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.832340 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities\") pod \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") "
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.832822 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content\") pod \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") "
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.832947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvjst\" (UniqueName: \"kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst\") pod \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\" (UID: \"19e5be76-b628-4d8b-9362-5c15fb3c16f1\") "
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.833386 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities" (OuterVolumeSpecName: "utilities") pod "19e5be76-b628-4d8b-9362-5c15fb3c16f1" (UID: "19e5be76-b628-4d8b-9362-5c15fb3c16f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.833727 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.840437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst" (OuterVolumeSpecName: "kube-api-access-rvjst") pod "19e5be76-b628-4d8b-9362-5c15fb3c16f1" (UID: "19e5be76-b628-4d8b-9362-5c15fb3c16f1"). InnerVolumeSpecName "kube-api-access-rvjst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.936285 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvjst\" (UniqueName: \"kubernetes.io/projected/19e5be76-b628-4d8b-9362-5c15fb3c16f1-kube-api-access-rvjst\") on node \"crc\" DevicePath \"\""
Oct 02 19:51:45 crc kubenswrapper[4909]: I1002 19:51:45.950102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19e5be76-b628-4d8b-9362-5c15fb3c16f1" (UID: "19e5be76-b628-4d8b-9362-5c15fb3c16f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.038732 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e5be76-b628-4d8b-9362-5c15fb3c16f1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.158199 4909 generic.go:334] "Generic (PLEG): container finished" podID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerID="21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93" exitCode=0
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.158261 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z9h4f"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.158300 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerDied","Data":"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"}
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.158704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z9h4f" event={"ID":"19e5be76-b628-4d8b-9362-5c15fb3c16f1","Type":"ContainerDied","Data":"0997c6d70dccdde6008acdb9ff8a89fd26065e0e5487d020416fee164067a608"}
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.158742 4909 scope.go:117] "RemoveContainer" containerID="21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.190159 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"]
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.193907 4909 scope.go:117] "RemoveContainer" containerID="13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.199628 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z9h4f"]
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.234429 4909 scope.go:117] "RemoveContainer" containerID="b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.269773 4909 scope.go:117] "RemoveContainer" containerID="21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"
Oct 02 19:51:46 crc kubenswrapper[4909]: E1002 19:51:46.270334 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93\": container with ID starting with 21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93 not found: ID does not exist" containerID="21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.270400 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93"} err="failed to get container status \"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93\": rpc error: code = NotFound desc = could not find container \"21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93\": container with ID starting with 21bd8cd59200c9fb060de655b9eb3041776188f6590a78daf1451eb17a552b93 not found: ID does not exist"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.270445 4909 scope.go:117] "RemoveContainer" containerID="13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"
Oct 02 19:51:46 crc kubenswrapper[4909]: E1002 19:51:46.270778 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2\": container with ID starting with 13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2 not found: ID does not exist" containerID="13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.270822 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2"} err="failed to get container status \"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2\": rpc error: code = NotFound desc = could not find container \"13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2\": container with ID starting with 13574d785384aa087fcd5b92bb9b2fa774ea4a6b9b0c0e6434ba709ed45339f2 not found: ID does not exist"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.270848 4909 scope.go:117] "RemoveContainer" containerID="b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3"
Oct 02 19:51:46 crc kubenswrapper[4909]: E1002 19:51:46.271135 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3\": container with ID starting with b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3 not found: ID does not exist" containerID="b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3"
Oct 02 19:51:46 crc kubenswrapper[4909]: I1002 19:51:46.271175 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3"} err="failed to get container status \"b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3\": rpc error: code = NotFound desc = could not find container \"b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3\": container with ID starting with b5e0e8480e374ba5b0d5a79c09d31c4f660254eca344f851e7a679b8e67695f3 not found: ID does not exist"
Oct 02 19:51:47 crc kubenswrapper[4909]: I1002 19:51:47.621969 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" path="/var/lib/kubelet/pods/19e5be76-b628-4d8b-9362-5c15fb3c16f1/volumes"
Oct 02 19:53:23 crc kubenswrapper[4909]: I1002 19:53:23.054300 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 19:53:23 crc kubenswrapper[4909]: I1002 19:53:23.055049 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 19:53:26 crc kubenswrapper[4909]: I1002 19:53:26.855292 4909 scope.go:117] "RemoveContainer" containerID="5a08edb0180328cbce1a6fd7a4ac749dd42a9b883412c1cdce04ed6489935760"
Oct 02 19:53:26 crc kubenswrapper[4909]: I1002 19:53:26.878576 4909 scope.go:117] "RemoveContainer" containerID="067c523562dd3a6fd3489f27b7b6ee54baa6e824c4baaf9ff502d20bdcb45d41"
Oct 02 19:53:26 crc kubenswrapper[4909]: I1002 19:53:26.898831 4909 scope.go:117] "RemoveContainer" containerID="91d5f34b44d53b27bc44776b34b95ab6e8aae0c854f626f98db2cc414ed4bf2a"
Oct 02 19:53:53 crc kubenswrapper[4909]: I1002 19:53:53.054259 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 19:53:53 crc kubenswrapper[4909]: I1002 19:53:53.054875 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.054514 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.055150 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.055203 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.056172 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.056237 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" gracePeriod=600
Oct 02 19:54:23 crc kubenswrapper[4909]: E1002 19:54:23.196922 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.971169 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" exitCode=0
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.971219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"}
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.971256 4909 scope.go:117] "RemoveContainer" containerID="34e0f2c01adf138e2ae315ef475ce1d2f4d9168bfd9fd1a835961dc232a54729"
Oct 02 19:54:23 crc kubenswrapper[4909]: I1002 19:54:23.972214 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:54:23 crc kubenswrapper[4909]: E1002 19:54:23.972861 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:54:36 crc kubenswrapper[4909]: I1002 19:54:36.609778 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:54:36 crc kubenswrapper[4909]: E1002 19:54:36.610863 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:54:48 crc kubenswrapper[4909]: I1002 19:54:48.608899 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:54:48 crc kubenswrapper[4909]: E1002 19:54:48.611165 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:55:02 crc kubenswrapper[4909]: I1002 19:55:02.609108 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:55:02 crc kubenswrapper[4909]: E1002 19:55:02.610561 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:55:14 crc kubenswrapper[4909]: I1002 19:55:14.608904 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:55:14 crc kubenswrapper[4909]: E1002 19:55:14.609796 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:55:27 crc kubenswrapper[4909]: I1002 19:55:27.608069 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:55:27 crc kubenswrapper[4909]: E1002 19:55:27.608860 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:55:38 crc kubenswrapper[4909]: I1002 19:55:38.609102 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:55:38 crc kubenswrapper[4909]: E1002 19:55:38.609924 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:55:50 crc kubenswrapper[4909]: I1002 19:55:50.609306 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:55:50 crc kubenswrapper[4909]: E1002 19:55:50.610526 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:05 crc kubenswrapper[4909]: I1002 19:56:05.610973 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:56:05 crc kubenswrapper[4909]: E1002 19:56:05.611973 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:16 crc kubenswrapper[4909]: I1002 19:56:16.610010 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:56:16 crc kubenswrapper[4909]: E1002 19:56:16.611415 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:29 crc kubenswrapper[4909]: I1002 19:56:29.627367 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:56:29 crc kubenswrapper[4909]: E1002 19:56:29.628701 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:42 crc kubenswrapper[4909]: I1002 19:56:42.609009 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:56:42 crc kubenswrapper[4909]: E1002 19:56:42.609762 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.155101 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-26xcw"]
Oct 02 19:56:48 crc kubenswrapper[4909]: E1002 19:56:48.156188 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="extract-utilities"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.156209 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="extract-utilities"
Oct 02 19:56:48 crc kubenswrapper[4909]: E1002 19:56:48.156231 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="registry-server"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.156239 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="registry-server"
Oct 02 19:56:48 crc kubenswrapper[4909]: E1002 19:56:48.156271 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="extract-content"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.156280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="extract-content"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.156546 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e5be76-b628-4d8b-9362-5c15fb3c16f1" containerName="registry-server"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.158523 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.196733 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-26xcw"]
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.258367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.258453 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.258562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx47d\" (UniqueName: \"kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.360647 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx47d\" (UniqueName: \"kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.360776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.360910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.361355 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.361404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.387108 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx47d\" (UniqueName: \"kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d\") pod \"community-operators-26xcw\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:48 crc kubenswrapper[4909]: I1002 19:56:48.487690 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:49 crc kubenswrapper[4909]: I1002 19:56:49.022178 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-26xcw"]
Oct 02 19:56:49 crc kubenswrapper[4909]: I1002 19:56:49.756095 4909 generic.go:334] "Generic (PLEG): container finished" podID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerID="96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42" exitCode=0
Oct 02 19:56:49 crc kubenswrapper[4909]: I1002 19:56:49.756239 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerDied","Data":"96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42"}
Oct 02 19:56:49 crc kubenswrapper[4909]: I1002 19:56:49.756548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerStarted","Data":"0f5d4a75a8c85f0af65e327ec4f2290d3b59224d6a7d6feace9a7919e6e3546d"}
Oct 02 19:56:49 crc kubenswrapper[4909]: I1002 19:56:49.759407 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 19:56:50 crc kubenswrapper[4909]: I1002 19:56:50.776455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerStarted","Data":"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9"}
Oct 02 19:56:51 crc kubenswrapper[4909]: I1002 19:56:51.790194 4909 generic.go:334] "Generic (PLEG): container finished" podID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerID="effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9" exitCode=0
Oct 02 19:56:51 crc kubenswrapper[4909]: I1002 19:56:51.790442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerDied","Data":"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9"}
Oct 02 19:56:52 crc kubenswrapper[4909]: I1002 19:56:52.806695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerStarted","Data":"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6"}
Oct 02 19:56:52 crc kubenswrapper[4909]: I1002 19:56:52.843743 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-26xcw" podStartSLOduration=2.350925958 podStartE2EDuration="4.843701635s" podCreationTimestamp="2025-10-02 19:56:48 +0000 UTC" firstStartedPulling="2025-10-02 19:56:49.758781114 +0000 UTC m=+5930.946277013" lastFinishedPulling="2025-10-02 19:56:52.251556801 +0000 UTC m=+5933.439052690" observedRunningTime="2025-10-02 19:56:52.833371512 +0000 UTC m=+5934.020867371" watchObservedRunningTime="2025-10-02 19:56:52.843701635 +0000 UTC m=+5934.031197494"
Oct 02 19:56:54 crc kubenswrapper[4909]: I1002 19:56:54.609807 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc"
Oct 02 19:56:54 crc kubenswrapper[4909]: E1002 19:56:54.610501 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.395946 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"]
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.404350 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.413301 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"]
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.501534 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.501595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87n7\" (UniqueName: \"kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.501747 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.603477 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.603632 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.603667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87n7\" (UniqueName: \"kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.603950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.604330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.644622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87n7\" (UniqueName: \"kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7\") pod \"redhat-marketplace-8sphs\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:57 crc kubenswrapper[4909]: I1002 19:56:57.749231 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sphs"
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.227836 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"]
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.488793 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.489210 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.534768 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.898294 4909 generic.go:334] "Generic (PLEG): container finished" podID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerID="6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d" exitCode=0
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.898394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerDied","Data":"6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d"}
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.898633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerStarted","Data":"6f377bda6e533879412b0b9d4f75906927f1f6b00e1b3752b1a18b790a229358"}
Oct 02 19:56:58 crc kubenswrapper[4909]: I1002 19:56:58.977969 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-26xcw"
Oct 02 19:56:59 crc kubenswrapper[4909]: I1002 19:56:59.913954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerStarted","Data":"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d"}
Oct 02 19:57:00 crc kubenswrapper[4909]: I1002 19:57:00.928986 4909 generic.go:334] "Generic (PLEG): container finished" podID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerID="77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d" exitCode=0
Oct 02 19:57:00 crc kubenswrapper[4909]: I1002 19:57:00.929058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerDied","Data":"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d"}
Oct 02 19:57:00 crc kubenswrapper[4909]: I1002 19:57:00.968140 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-26xcw"]
Oct 02 19:57:00 crc kubenswrapper[4909]: I1002 19:57:00.968371 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-26xcw" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="registry-server" containerID="cri-o://4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6" gracePeriod=2
Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.532230 4909 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-26xcw" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.607672 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities\") pod \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.607955 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx47d\" (UniqueName: \"kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d\") pod \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.608246 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content\") pod \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\" (UID: \"d71b0b2b-e263-44b0-9d27-05f0757dfd45\") " Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.609529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities" (OuterVolumeSpecName: "utilities") pod "d71b0b2b-e263-44b0-9d27-05f0757dfd45" (UID: "d71b0b2b-e263-44b0-9d27-05f0757dfd45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.623830 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d" (OuterVolumeSpecName: "kube-api-access-bx47d") pod "d71b0b2b-e263-44b0-9d27-05f0757dfd45" (UID: "d71b0b2b-e263-44b0-9d27-05f0757dfd45"). InnerVolumeSpecName "kube-api-access-bx47d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.679455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d71b0b2b-e263-44b0-9d27-05f0757dfd45" (UID: "d71b0b2b-e263-44b0-9d27-05f0757dfd45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.713731 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.713843 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx47d\" (UniqueName: \"kubernetes.io/projected/d71b0b2b-e263-44b0-9d27-05f0757dfd45-kube-api-access-bx47d\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.713893 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71b0b2b-e263-44b0-9d27-05f0757dfd45-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.944816 4909 generic.go:334] "Generic (PLEG): container finished" podID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerID="4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6" exitCode=0 Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.944933 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-26xcw" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.946368 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerDied","Data":"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6"} Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.946505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-26xcw" event={"ID":"d71b0b2b-e263-44b0-9d27-05f0757dfd45","Type":"ContainerDied","Data":"0f5d4a75a8c85f0af65e327ec4f2290d3b59224d6a7d6feace9a7919e6e3546d"} Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.946614 4909 scope.go:117] "RemoveContainer" containerID="4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.952637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerStarted","Data":"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131"} Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.980760 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8sphs" podStartSLOduration=2.494021948 podStartE2EDuration="4.980739906s" podCreationTimestamp="2025-10-02 19:56:57 +0000 UTC" firstStartedPulling="2025-10-02 19:56:58.901352928 +0000 UTC m=+5940.088848787" lastFinishedPulling="2025-10-02 19:57:01.388070886 +0000 UTC m=+5942.575566745" observedRunningTime="2025-10-02 19:57:01.972161979 +0000 UTC m=+5943.159657868" watchObservedRunningTime="2025-10-02 19:57:01.980739906 +0000 UTC m=+5943.168235765" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.981065 4909 scope.go:117] "RemoveContainer" 
containerID="effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9" Oct 02 19:57:01 crc kubenswrapper[4909]: I1002 19:57:01.997994 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-26xcw"] Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.022708 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-26xcw"] Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.029381 4909 scope.go:117] "RemoveContainer" containerID="96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.068565 4909 scope.go:117] "RemoveContainer" containerID="4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6" Oct 02 19:57:02 crc kubenswrapper[4909]: E1002 19:57:02.069097 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6\": container with ID starting with 4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6 not found: ID does not exist" containerID="4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.069135 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6"} err="failed to get container status \"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6\": rpc error: code = NotFound desc = could not find container \"4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6\": container with ID starting with 4a235b4161a771995965c158d211d6523b098b7e0cae8583bc478a6ba43d2bf6 not found: ID does not exist" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.069158 4909 scope.go:117] "RemoveContainer" 
containerID="effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9" Oct 02 19:57:02 crc kubenswrapper[4909]: E1002 19:57:02.069454 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9\": container with ID starting with effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9 not found: ID does not exist" containerID="effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.069475 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9"} err="failed to get container status \"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9\": rpc error: code = NotFound desc = could not find container \"effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9\": container with ID starting with effb8f7c2549dda2eff0e291c9b79533f41180e569d214ec8eaa0dc99b75a6e9 not found: ID does not exist" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.069492 4909 scope.go:117] "RemoveContainer" containerID="96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42" Oct 02 19:57:02 crc kubenswrapper[4909]: E1002 19:57:02.069785 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42\": container with ID starting with 96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42 not found: ID does not exist" containerID="96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42" Oct 02 19:57:02 crc kubenswrapper[4909]: I1002 19:57:02.069807 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42"} err="failed to get container status \"96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42\": rpc error: code = NotFound desc = could not find container \"96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42\": container with ID starting with 96c00787dc4ae509a863dca48311b3deb49ea22070b48c727d20fae9dd9d3d42 not found: ID does not exist" Oct 02 19:57:03 crc kubenswrapper[4909]: I1002 19:57:03.630783 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" path="/var/lib/kubelet/pods/d71b0b2b-e263-44b0-9d27-05f0757dfd45/volumes" Oct 02 19:57:07 crc kubenswrapper[4909]: I1002 19:57:07.609468 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:57:07 crc kubenswrapper[4909]: E1002 19:57:07.610658 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:57:07 crc kubenswrapper[4909]: I1002 19:57:07.749896 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:07 crc kubenswrapper[4909]: I1002 19:57:07.749964 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:07 crc kubenswrapper[4909]: I1002 19:57:07.827234 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:08 crc kubenswrapper[4909]: I1002 
19:57:08.093723 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:08 crc kubenswrapper[4909]: I1002 19:57:08.143600 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"] Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.065068 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8sphs" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="registry-server" containerID="cri-o://f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131" gracePeriod=2 Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.613255 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.735751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content\") pod \"278b01b7-ea8f-4c05-ac55-3999222d33ca\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.735901 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j87n7\" (UniqueName: \"kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7\") pod \"278b01b7-ea8f-4c05-ac55-3999222d33ca\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.735935 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities\") pod \"278b01b7-ea8f-4c05-ac55-3999222d33ca\" (UID: \"278b01b7-ea8f-4c05-ac55-3999222d33ca\") " Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 
19:57:10.738383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities" (OuterVolumeSpecName: "utilities") pod "278b01b7-ea8f-4c05-ac55-3999222d33ca" (UID: "278b01b7-ea8f-4c05-ac55-3999222d33ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.741819 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7" (OuterVolumeSpecName: "kube-api-access-j87n7") pod "278b01b7-ea8f-4c05-ac55-3999222d33ca" (UID: "278b01b7-ea8f-4c05-ac55-3999222d33ca"). InnerVolumeSpecName "kube-api-access-j87n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.751358 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "278b01b7-ea8f-4c05-ac55-3999222d33ca" (UID: "278b01b7-ea8f-4c05-ac55-3999222d33ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.839647 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.839887 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278b01b7-ea8f-4c05-ac55-3999222d33ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:10 crc kubenswrapper[4909]: I1002 19:57:10.839903 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j87n7\" (UniqueName: \"kubernetes.io/projected/278b01b7-ea8f-4c05-ac55-3999222d33ca-kube-api-access-j87n7\") on node \"crc\" DevicePath \"\"" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.077855 4909 generic.go:334] "Generic (PLEG): container finished" podID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerID="f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131" exitCode=0 Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.077900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerDied","Data":"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131"} Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.077933 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sphs" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.077951 4909 scope.go:117] "RemoveContainer" containerID="f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.077936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sphs" event={"ID":"278b01b7-ea8f-4c05-ac55-3999222d33ca","Type":"ContainerDied","Data":"6f377bda6e533879412b0b9d4f75906927f1f6b00e1b3752b1a18b790a229358"} Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.117776 4909 scope.go:117] "RemoveContainer" containerID="77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.120079 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"] Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.131016 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sphs"] Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.152237 4909 scope.go:117] "RemoveContainer" containerID="6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.206411 4909 scope.go:117] "RemoveContainer" containerID="f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131" Oct 02 19:57:11 crc kubenswrapper[4909]: E1002 19:57:11.207377 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131\": container with ID starting with f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131 not found: ID does not exist" containerID="f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.207413 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131"} err="failed to get container status \"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131\": rpc error: code = NotFound desc = could not find container \"f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131\": container with ID starting with f0737b9f13b949d1b179c457b538d0defc5e0b192d44f6fe48dc48c60c47c131 not found: ID does not exist" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.207442 4909 scope.go:117] "RemoveContainer" containerID="77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d" Oct 02 19:57:11 crc kubenswrapper[4909]: E1002 19:57:11.207679 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d\": container with ID starting with 77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d not found: ID does not exist" containerID="77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.207711 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d"} err="failed to get container status \"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d\": rpc error: code = NotFound desc = could not find container \"77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d\": container with ID starting with 77181869b9a7ee683637f397c8e93e6f8a5112b1d59c6935caaa036ccd753e8d not found: ID does not exist" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.207729 4909 scope.go:117] "RemoveContainer" containerID="6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d" Oct 02 19:57:11 crc kubenswrapper[4909]: E1002 
19:57:11.207949 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d\": container with ID starting with 6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d not found: ID does not exist" containerID="6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.207979 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d"} err="failed to get container status \"6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d\": rpc error: code = NotFound desc = could not find container \"6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d\": container with ID starting with 6d0d77f100e8265e5561ab6e489bbb84eb54a04241291d73c759571936469b4d not found: ID does not exist" Oct 02 19:57:11 crc kubenswrapper[4909]: I1002 19:57:11.651841 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" path="/var/lib/kubelet/pods/278b01b7-ea8f-4c05-ac55-3999222d33ca/volumes" Oct 02 19:57:18 crc kubenswrapper[4909]: I1002 19:57:18.608665 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:57:18 crc kubenswrapper[4909]: E1002 19:57:18.609199 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:57:32 crc kubenswrapper[4909]: I1002 19:57:32.609398 
4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:57:32 crc kubenswrapper[4909]: E1002 19:57:32.610288 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:57:47 crc kubenswrapper[4909]: I1002 19:57:47.608301 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:57:47 crc kubenswrapper[4909]: E1002 19:57:47.609327 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:58:01 crc kubenswrapper[4909]: I1002 19:58:01.608281 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:58:01 crc kubenswrapper[4909]: E1002 19:58:01.610976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:58:13 crc kubenswrapper[4909]: I1002 
19:58:13.608547 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:58:13 crc kubenswrapper[4909]: E1002 19:58:13.609242 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:58:28 crc kubenswrapper[4909]: I1002 19:58:28.609695 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:58:28 crc kubenswrapper[4909]: E1002 19:58:28.610838 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:58:41 crc kubenswrapper[4909]: I1002 19:58:41.608665 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:58:41 crc kubenswrapper[4909]: E1002 19:58:41.609555 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:58:54 crc 
kubenswrapper[4909]: I1002 19:58:54.608723 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:58:54 crc kubenswrapper[4909]: E1002 19:58:54.609586 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:59:07 crc kubenswrapper[4909]: I1002 19:59:07.611281 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:59:07 crc kubenswrapper[4909]: E1002 19:59:07.612669 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:59:18 crc kubenswrapper[4909]: E1002 19:59:18.058208 4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:59162->38.102.83.129:45429: write tcp 38.102.83.129:59162->38.102.83.129:45429: write: broken pipe Oct 02 19:59:19 crc kubenswrapper[4909]: I1002 19:59:19.617083 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:59:19 crc kubenswrapper[4909]: E1002 19:59:19.617615 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 19:59:33 crc kubenswrapper[4909]: I1002 19:59:33.615382 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 19:59:34 crc kubenswrapper[4909]: I1002 19:59:34.695253 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db"} Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.182185 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9"] Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183332 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="registry-server" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183348 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="registry-server" Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183373 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="extract-utilities" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183384 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="extract-utilities" Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183420 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="registry-server" Oct 02 20:00:00 crc 
kubenswrapper[4909]: I1002 20:00:00.183429 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="registry-server" Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="extract-utilities" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183450 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="extract-utilities" Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183463 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="extract-content" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183471 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="extract-content" Oct 02 20:00:00 crc kubenswrapper[4909]: E1002 20:00:00.183488 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="extract-content" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183495 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="extract-content" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183800 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="278b01b7-ea8f-4c05-ac55-3999222d33ca" containerName="registry-server" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.183817 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71b0b2b-e263-44b0-9d27-05f0757dfd45" containerName="registry-server" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.184769 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.196004 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9"] Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.216088 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.217456 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.376768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.376822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxs77\" (UniqueName: \"kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.377432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.479307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.479422 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.479446 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxs77\" (UniqueName: \"kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.482870 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.504777 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.543960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxs77\" (UniqueName: \"kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77\") pod \"collect-profiles-29323920-xkqn9\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:00 crc kubenswrapper[4909]: I1002 20:00:00.558173 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:01 crc kubenswrapper[4909]: I1002 20:00:01.120973 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9"] Oct 02 20:00:02 crc kubenswrapper[4909]: I1002 20:00:02.007812 4909 generic.go:334] "Generic (PLEG): container finished" podID="f06a6ec9-167c-4f75-8171-19c3a104971e" containerID="de76e1d5c9576aca4e51a8ac7435f7db984dd102133762fa9848dd0b2aa64b55" exitCode=0 Oct 02 20:00:02 crc kubenswrapper[4909]: I1002 20:00:02.008455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" event={"ID":"f06a6ec9-167c-4f75-8171-19c3a104971e","Type":"ContainerDied","Data":"de76e1d5c9576aca4e51a8ac7435f7db984dd102133762fa9848dd0b2aa64b55"} Oct 02 20:00:02 crc kubenswrapper[4909]: I1002 20:00:02.008498 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" 
event={"ID":"f06a6ec9-167c-4f75-8171-19c3a104971e","Type":"ContainerStarted","Data":"8f7d10972b9c8558462a7d7c0fe1314d704e3e51f83bf36c4fc4519eb9c45676"} Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.430959 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.486332 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxs77\" (UniqueName: \"kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77\") pod \"f06a6ec9-167c-4f75-8171-19c3a104971e\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.486660 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume\") pod \"f06a6ec9-167c-4f75-8171-19c3a104971e\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.486691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume\") pod \"f06a6ec9-167c-4f75-8171-19c3a104971e\" (UID: \"f06a6ec9-167c-4f75-8171-19c3a104971e\") " Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.488589 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume" (OuterVolumeSpecName: "config-volume") pod "f06a6ec9-167c-4f75-8171-19c3a104971e" (UID: "f06a6ec9-167c-4f75-8171-19c3a104971e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.496364 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f06a6ec9-167c-4f75-8171-19c3a104971e" (UID: "f06a6ec9-167c-4f75-8171-19c3a104971e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.497117 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77" (OuterVolumeSpecName: "kube-api-access-jxs77") pod "f06a6ec9-167c-4f75-8171-19c3a104971e" (UID: "f06a6ec9-167c-4f75-8171-19c3a104971e"). InnerVolumeSpecName "kube-api-access-jxs77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.590045 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxs77\" (UniqueName: \"kubernetes.io/projected/f06a6ec9-167c-4f75-8171-19c3a104971e-kube-api-access-jxs77\") on node \"crc\" DevicePath \"\"" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.590085 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f06a6ec9-167c-4f75-8171-19c3a104971e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:00:03 crc kubenswrapper[4909]: I1002 20:00:03.590101 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f06a6ec9-167c-4f75-8171-19c3a104971e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:00:04 crc kubenswrapper[4909]: I1002 20:00:04.036175 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" 
event={"ID":"f06a6ec9-167c-4f75-8171-19c3a104971e","Type":"ContainerDied","Data":"8f7d10972b9c8558462a7d7c0fe1314d704e3e51f83bf36c4fc4519eb9c45676"} Oct 02 20:00:04 crc kubenswrapper[4909]: I1002 20:00:04.036533 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7d10972b9c8558462a7d7c0fe1314d704e3e51f83bf36c4fc4519eb9c45676" Oct 02 20:00:04 crc kubenswrapper[4909]: I1002 20:00:04.036223 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323920-xkqn9" Oct 02 20:00:04 crc kubenswrapper[4909]: I1002 20:00:04.511546 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r"] Oct 02 20:00:04 crc kubenswrapper[4909]: I1002 20:00:04.520693 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323875-m6x8r"] Oct 02 20:00:05 crc kubenswrapper[4909]: I1002 20:00:05.625513 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956f29e1-a716-47ea-b00a-4827ce24cacf" path="/var/lib/kubelet/pods/956f29e1-a716-47ea-b00a-4827ce24cacf/volumes" Oct 02 20:00:27 crc kubenswrapper[4909]: I1002 20:00:27.187119 4909 scope.go:117] "RemoveContainer" containerID="62b922570371ced9f422dfa4e1af422fcedd70badf9b7734e27dca00cd0ef652" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.156326 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323921-smc4d"] Oct 02 20:01:00 crc kubenswrapper[4909]: E1002 20:01:00.157499 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06a6ec9-167c-4f75-8171-19c3a104971e" containerName="collect-profiles" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.157528 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06a6ec9-167c-4f75-8171-19c3a104971e" containerName="collect-profiles" Oct 02 20:01:00 crc 
kubenswrapper[4909]: I1002 20:01:00.157838 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06a6ec9-167c-4f75-8171-19c3a104971e" containerName="collect-profiles" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.159004 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.168817 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323921-smc4d"] Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.183000 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.184589 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxc4q\" (UniqueName: \"kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.184952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.185200 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.287330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.287461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxc4q\" (UniqueName: \"kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.287550 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.287572 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.293477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.293500 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.294353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.313114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxc4q\" (UniqueName: \"kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q\") pod \"keystone-cron-29323921-smc4d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.492206 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:00 crc kubenswrapper[4909]: I1002 20:01:00.958649 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323921-smc4d"] Oct 02 20:01:01 crc kubenswrapper[4909]: I1002 20:01:01.722605 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323921-smc4d" event={"ID":"d5fd446f-b5eb-4efa-ae6f-877c30b6321d","Type":"ContainerStarted","Data":"b2d767d26597456418ad0b09333da91643d78638913d8942a110200dad7be9c4"} Oct 02 20:01:01 crc kubenswrapper[4909]: I1002 20:01:01.723135 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323921-smc4d" event={"ID":"d5fd446f-b5eb-4efa-ae6f-877c30b6321d","Type":"ContainerStarted","Data":"4d8703e2328422feb21c84780bf952025dce1a2963c5659be754c8dac52555e8"} Oct 02 20:01:01 crc kubenswrapper[4909]: I1002 20:01:01.747014 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323921-smc4d" podStartSLOduration=1.746989552 podStartE2EDuration="1.746989552s" podCreationTimestamp="2025-10-02 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 20:01:01.737815786 +0000 UTC m=+6182.925311685" watchObservedRunningTime="2025-10-02 20:01:01.746989552 +0000 UTC m=+6182.934485431" Oct 02 20:01:04 crc kubenswrapper[4909]: I1002 20:01:04.751875 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5fd446f-b5eb-4efa-ae6f-877c30b6321d" containerID="b2d767d26597456418ad0b09333da91643d78638913d8942a110200dad7be9c4" exitCode=0 Oct 02 20:01:04 crc kubenswrapper[4909]: I1002 20:01:04.751977 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323921-smc4d" 
event={"ID":"d5fd446f-b5eb-4efa-ae6f-877c30b6321d","Type":"ContainerDied","Data":"b2d767d26597456418ad0b09333da91643d78638913d8942a110200dad7be9c4"} Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.234884 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.356605 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxc4q\" (UniqueName: \"kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q\") pod \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.356675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle\") pod \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.356732 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys\") pod \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.356825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data\") pod \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\" (UID: \"d5fd446f-b5eb-4efa-ae6f-877c30b6321d\") " Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.365944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "d5fd446f-b5eb-4efa-ae6f-877c30b6321d" (UID: "d5fd446f-b5eb-4efa-ae6f-877c30b6321d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.373777 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q" (OuterVolumeSpecName: "kube-api-access-gxc4q") pod "d5fd446f-b5eb-4efa-ae6f-877c30b6321d" (UID: "d5fd446f-b5eb-4efa-ae6f-877c30b6321d"). InnerVolumeSpecName "kube-api-access-gxc4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.413393 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5fd446f-b5eb-4efa-ae6f-877c30b6321d" (UID: "d5fd446f-b5eb-4efa-ae6f-877c30b6321d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.426942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data" (OuterVolumeSpecName: "config-data") pod "d5fd446f-b5eb-4efa-ae6f-877c30b6321d" (UID: "d5fd446f-b5eb-4efa-ae6f-877c30b6321d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.461539 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxc4q\" (UniqueName: \"kubernetes.io/projected/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-kube-api-access-gxc4q\") on node \"crc\" DevicePath \"\"" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.461981 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.462246 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.462401 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fd446f-b5eb-4efa-ae6f-877c30b6321d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.781240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323921-smc4d" event={"ID":"d5fd446f-b5eb-4efa-ae6f-877c30b6321d","Type":"ContainerDied","Data":"4d8703e2328422feb21c84780bf952025dce1a2963c5659be754c8dac52555e8"} Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.800234 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d8703e2328422feb21c84780bf952025dce1a2963c5659be754c8dac52555e8" Oct 02 20:01:06 crc kubenswrapper[4909]: I1002 20:01:06.781312 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323921-smc4d" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.730381 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:01:41 crc kubenswrapper[4909]: E1002 20:01:41.731602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fd446f-b5eb-4efa-ae6f-877c30b6321d" containerName="keystone-cron" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.731620 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fd446f-b5eb-4efa-ae6f-877c30b6321d" containerName="keystone-cron" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.731946 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5fd446f-b5eb-4efa-ae6f-877c30b6321d" containerName="keystone-cron" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.734019 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.744913 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.813206 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.813298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" 
Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.813352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcdt\" (UniqueName: \"kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.915158 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.915268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcdt\" (UniqueName: \"kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.915473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.916128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc 
kubenswrapper[4909]: I1002 20:01:41.916421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:41 crc kubenswrapper[4909]: I1002 20:01:41.941765 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcdt\" (UniqueName: \"kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt\") pod \"redhat-operators-k72c5\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:42 crc kubenswrapper[4909]: I1002 20:01:42.060120 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:42 crc kubenswrapper[4909]: I1002 20:01:42.615175 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:01:43 crc kubenswrapper[4909]: I1002 20:01:43.197196 4909 generic.go:334] "Generic (PLEG): container finished" podID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerID="23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942" exitCode=0 Oct 02 20:01:43 crc kubenswrapper[4909]: I1002 20:01:43.197274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerDied","Data":"23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942"} Oct 02 20:01:43 crc kubenswrapper[4909]: I1002 20:01:43.197528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" 
event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerStarted","Data":"7bb283a656aee60d2e19e9746fbd26d4cbddaba190ff6a37f2701136377cca43"} Oct 02 20:01:44 crc kubenswrapper[4909]: I1002 20:01:44.221563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerStarted","Data":"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45"} Oct 02 20:01:49 crc kubenswrapper[4909]: I1002 20:01:49.292553 4909 generic.go:334] "Generic (PLEG): container finished" podID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerID="1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45" exitCode=0 Oct 02 20:01:49 crc kubenswrapper[4909]: I1002 20:01:49.292620 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerDied","Data":"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45"} Oct 02 20:01:50 crc kubenswrapper[4909]: I1002 20:01:50.309003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerStarted","Data":"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711"} Oct 02 20:01:50 crc kubenswrapper[4909]: I1002 20:01:50.333755 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k72c5" podStartSLOduration=2.751456958 podStartE2EDuration="9.33373135s" podCreationTimestamp="2025-10-02 20:01:41 +0000 UTC" firstStartedPulling="2025-10-02 20:01:43.19956982 +0000 UTC m=+6224.387065679" lastFinishedPulling="2025-10-02 20:01:49.781844212 +0000 UTC m=+6230.969340071" observedRunningTime="2025-10-02 20:01:50.331627105 +0000 UTC m=+6231.519122964" watchObservedRunningTime="2025-10-02 20:01:50.33373135 +0000 UTC m=+6231.521227219" 
Oct 02 20:01:52 crc kubenswrapper[4909]: I1002 20:01:52.060912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:52 crc kubenswrapper[4909]: I1002 20:01:52.061400 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:01:53 crc kubenswrapper[4909]: I1002 20:01:53.054942 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:01:53 crc kubenswrapper[4909]: I1002 20:01:53.055373 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:01:53 crc kubenswrapper[4909]: I1002 20:01:53.138301 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k72c5" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="registry-server" probeResult="failure" output=< Oct 02 20:01:53 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:01:53 crc kubenswrapper[4909]: > Oct 02 20:02:02 crc kubenswrapper[4909]: I1002 20:02:02.152803 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:02:02 crc kubenswrapper[4909]: I1002 20:02:02.213912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:02:02 crc kubenswrapper[4909]: I1002 20:02:02.408423 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:02:03 crc kubenswrapper[4909]: I1002 20:02:03.482491 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k72c5" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="registry-server" containerID="cri-o://2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711" gracePeriod=2 Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.108905 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.203459 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content\") pod \"c3a08d35-5f43-4495-bfb6-b55e3390f264\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.203599 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities\") pod \"c3a08d35-5f43-4495-bfb6-b55e3390f264\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.203823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztcdt\" (UniqueName: \"kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt\") pod \"c3a08d35-5f43-4495-bfb6-b55e3390f264\" (UID: \"c3a08d35-5f43-4495-bfb6-b55e3390f264\") " Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.204605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities" (OuterVolumeSpecName: "utilities") pod 
"c3a08d35-5f43-4495-bfb6-b55e3390f264" (UID: "c3a08d35-5f43-4495-bfb6-b55e3390f264"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.205372 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.216751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt" (OuterVolumeSpecName: "kube-api-access-ztcdt") pod "c3a08d35-5f43-4495-bfb6-b55e3390f264" (UID: "c3a08d35-5f43-4495-bfb6-b55e3390f264"). InnerVolumeSpecName "kube-api-access-ztcdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.307540 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztcdt\" (UniqueName: \"kubernetes.io/projected/c3a08d35-5f43-4495-bfb6-b55e3390f264-kube-api-access-ztcdt\") on node \"crc\" DevicePath \"\"" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.340864 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3a08d35-5f43-4495-bfb6-b55e3390f264" (UID: "c3a08d35-5f43-4495-bfb6-b55e3390f264"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.412430 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a08d35-5f43-4495-bfb6-b55e3390f264-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.511894 4909 generic.go:334] "Generic (PLEG): container finished" podID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerID="2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711" exitCode=0 Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.512008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerDied","Data":"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711"} Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.512071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k72c5" event={"ID":"c3a08d35-5f43-4495-bfb6-b55e3390f264","Type":"ContainerDied","Data":"7bb283a656aee60d2e19e9746fbd26d4cbddaba190ff6a37f2701136377cca43"} Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.512104 4909 scope.go:117] "RemoveContainer" containerID="2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.519207 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k72c5" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.563200 4909 scope.go:117] "RemoveContainer" containerID="1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.571718 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.590259 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k72c5"] Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.603998 4909 scope.go:117] "RemoveContainer" containerID="23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.648091 4909 scope.go:117] "RemoveContainer" containerID="2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711" Oct 02 20:02:04 crc kubenswrapper[4909]: E1002 20:02:04.648607 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711\": container with ID starting with 2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711 not found: ID does not exist" containerID="2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.648638 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711"} err="failed to get container status \"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711\": rpc error: code = NotFound desc = could not find container \"2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711\": container with ID starting with 2aa0f71c5dfd159bd8bd72f274c61f3cf1d0bb8cac069d553b60351a70520711 not found: ID does 
not exist" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.648658 4909 scope.go:117] "RemoveContainer" containerID="1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45" Oct 02 20:02:04 crc kubenswrapper[4909]: E1002 20:02:04.649127 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45\": container with ID starting with 1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45 not found: ID does not exist" containerID="1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.649170 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45"} err="failed to get container status \"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45\": rpc error: code = NotFound desc = could not find container \"1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45\": container with ID starting with 1cc54157eb94f57542f3938120f729c7b07094d1a68f353c3400235973e87a45 not found: ID does not exist" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.649199 4909 scope.go:117] "RemoveContainer" containerID="23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942" Oct 02 20:02:04 crc kubenswrapper[4909]: E1002 20:02:04.649650 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942\": container with ID starting with 23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942 not found: ID does not exist" containerID="23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942" Oct 02 20:02:04 crc kubenswrapper[4909]: I1002 20:02:04.649695 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942"} err="failed to get container status \"23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942\": rpc error: code = NotFound desc = could not find container \"23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942\": container with ID starting with 23361b971390bf0e5dec32797cafb07005580b9ef6f1eae1519fdeb00c9f1942 not found: ID does not exist" Oct 02 20:02:05 crc kubenswrapper[4909]: I1002 20:02:05.623844 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" path="/var/lib/kubelet/pods/c3a08d35-5f43-4495-bfb6-b55e3390f264/volumes" Oct 02 20:02:23 crc kubenswrapper[4909]: I1002 20:02:23.055262 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:02:23 crc kubenswrapper[4909]: I1002 20:02:23.055863 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:02:53 crc kubenswrapper[4909]: I1002 20:02:53.054757 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:02:53 crc kubenswrapper[4909]: I1002 20:02:53.055408 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:02:53 crc kubenswrapper[4909]: I1002 20:02:53.055453 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:02:53 crc kubenswrapper[4909]: I1002 20:02:53.056427 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:02:53 crc kubenswrapper[4909]: I1002 20:02:53.056483 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db" gracePeriod=600 Oct 02 20:02:54 crc kubenswrapper[4909]: I1002 20:02:54.086903 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db" exitCode=0 Oct 02 20:02:54 crc kubenswrapper[4909]: I1002 20:02:54.087043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db"} Oct 02 20:02:54 crc kubenswrapper[4909]: I1002 20:02:54.087574 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58"} Oct 02 20:02:54 crc kubenswrapper[4909]: I1002 20:02:54.087617 4909 scope.go:117] "RemoveContainer" containerID="e1bd4801e9b67e776098f1f953c85b96ee1180bb563ea9ee6ba512412ea67efc" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.955850 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:02 crc kubenswrapper[4909]: E1002 20:03:02.958215 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="extract-utilities" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.958239 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="extract-utilities" Oct 02 20:03:02 crc kubenswrapper[4909]: E1002 20:03:02.958270 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="extract-content" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.958278 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="extract-content" Oct 02 20:03:02 crc kubenswrapper[4909]: E1002 20:03:02.958320 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="registry-server" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.958328 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="registry-server" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.958610 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a08d35-5f43-4495-bfb6-b55e3390f264" containerName="registry-server" Oct 02 
20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.962935 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:02 crc kubenswrapper[4909]: I1002 20:03:02.980509 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.046840 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7zkk\" (UniqueName: \"kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.047330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.047436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.149491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " 
pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.149620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7zkk\" (UniqueName: \"kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.149706 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.150138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.150217 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.174811 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7zkk\" (UniqueName: \"kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk\") pod \"certified-operators-2t8bz\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " 
pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.311043 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:03 crc kubenswrapper[4909]: I1002 20:03:03.874795 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:04 crc kubenswrapper[4909]: I1002 20:03:04.204534 4909 generic.go:334] "Generic (PLEG): container finished" podID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerID="c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089" exitCode=0 Oct 02 20:03:04 crc kubenswrapper[4909]: I1002 20:03:04.204631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerDied","Data":"c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089"} Oct 02 20:03:04 crc kubenswrapper[4909]: I1002 20:03:04.204889 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerStarted","Data":"e6422e4a3de37557af3fc9d01312fdf271d522746b8a2f5aa947d03421ad5f27"} Oct 02 20:03:04 crc kubenswrapper[4909]: I1002 20:03:04.206452 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:03:06 crc kubenswrapper[4909]: I1002 20:03:06.234296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerStarted","Data":"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e"} Oct 02 20:03:07 crc kubenswrapper[4909]: I1002 20:03:07.246872 4909 generic.go:334] "Generic (PLEG): container finished" podID="22d523bb-dd19-4e82-b484-0a27e661dc52" 
containerID="6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e" exitCode=0 Oct 02 20:03:07 crc kubenswrapper[4909]: I1002 20:03:07.246972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerDied","Data":"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e"} Oct 02 20:03:08 crc kubenswrapper[4909]: I1002 20:03:08.265335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerStarted","Data":"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76"} Oct 02 20:03:08 crc kubenswrapper[4909]: I1002 20:03:08.285823 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2t8bz" podStartSLOduration=2.823602644 podStartE2EDuration="6.285807883s" podCreationTimestamp="2025-10-02 20:03:02 +0000 UTC" firstStartedPulling="2025-10-02 20:03:04.206064374 +0000 UTC m=+6305.393560243" lastFinishedPulling="2025-10-02 20:03:07.668269583 +0000 UTC m=+6308.855765482" observedRunningTime="2025-10-02 20:03:08.283630476 +0000 UTC m=+6309.471126345" watchObservedRunningTime="2025-10-02 20:03:08.285807883 +0000 UTC m=+6309.473303742" Oct 02 20:03:13 crc kubenswrapper[4909]: I1002 20:03:13.311317 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:13 crc kubenswrapper[4909]: I1002 20:03:13.311887 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:13 crc kubenswrapper[4909]: I1002 20:03:13.385173 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:14 crc kubenswrapper[4909]: I1002 
20:03:14.438786 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:14 crc kubenswrapper[4909]: I1002 20:03:14.499423 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:16 crc kubenswrapper[4909]: I1002 20:03:16.387862 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2t8bz" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="registry-server" containerID="cri-o://6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76" gracePeriod=2 Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.045563 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.175208 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content\") pod \"22d523bb-dd19-4e82-b484-0a27e661dc52\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.175343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities\") pod \"22d523bb-dd19-4e82-b484-0a27e661dc52\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.175594 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7zkk\" (UniqueName: \"kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk\") pod \"22d523bb-dd19-4e82-b484-0a27e661dc52\" (UID: \"22d523bb-dd19-4e82-b484-0a27e661dc52\") " Oct 02 20:03:17 crc kubenswrapper[4909]: 
I1002 20:03:17.176301 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities" (OuterVolumeSpecName: "utilities") pod "22d523bb-dd19-4e82-b484-0a27e661dc52" (UID: "22d523bb-dd19-4e82-b484-0a27e661dc52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.183013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk" (OuterVolumeSpecName: "kube-api-access-f7zkk") pod "22d523bb-dd19-4e82-b484-0a27e661dc52" (UID: "22d523bb-dd19-4e82-b484-0a27e661dc52"). InnerVolumeSpecName "kube-api-access-f7zkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.279384 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7zkk\" (UniqueName: \"kubernetes.io/projected/22d523bb-dd19-4e82-b484-0a27e661dc52-kube-api-access-f7zkk\") on node \"crc\" DevicePath \"\"" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.279773 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.399456 4909 generic.go:334] "Generic (PLEG): container finished" podID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerID="6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76" exitCode=0 Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.399505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerDied","Data":"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76"} Oct 02 20:03:17 crc 
kubenswrapper[4909]: I1002 20:03:17.399603 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2t8bz" event={"ID":"22d523bb-dd19-4e82-b484-0a27e661dc52","Type":"ContainerDied","Data":"e6422e4a3de37557af3fc9d01312fdf271d522746b8a2f5aa947d03421ad5f27"} Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.399634 4909 scope.go:117] "RemoveContainer" containerID="6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.400932 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2t8bz" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.426084 4909 scope.go:117] "RemoveContainer" containerID="6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.459462 4909 scope.go:117] "RemoveContainer" containerID="c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.514401 4909 scope.go:117] "RemoveContainer" containerID="6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76" Oct 02 20:03:17 crc kubenswrapper[4909]: E1002 20:03:17.515800 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76\": container with ID starting with 6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76 not found: ID does not exist" containerID="6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.515881 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76"} err="failed to get container status 
\"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76\": rpc error: code = NotFound desc = could not find container \"6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76\": container with ID starting with 6563b19b9cda76c43c7b9b7aca21ae2e25c52186be88feae6b5cc003b63c4c76 not found: ID does not exist" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.515923 4909 scope.go:117] "RemoveContainer" containerID="6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e" Oct 02 20:03:17 crc kubenswrapper[4909]: E1002 20:03:17.516835 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e\": container with ID starting with 6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e not found: ID does not exist" containerID="6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.516875 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e"} err="failed to get container status \"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e\": rpc error: code = NotFound desc = could not find container \"6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e\": container with ID starting with 6a9ef24adf790e58b8334a462cc70933d3d32cf379c20f704fa4575d730c505e not found: ID does not exist" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.516916 4909 scope.go:117] "RemoveContainer" containerID="c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089" Oct 02 20:03:17 crc kubenswrapper[4909]: E1002 20:03:17.517615 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089\": container with ID starting with c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089 not found: ID does not exist" containerID="c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.517655 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089"} err="failed to get container status \"c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089\": rpc error: code = NotFound desc = could not find container \"c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089\": container with ID starting with c4e805bac4168ee61026aa907d6eeb890e35f4a95147b8d702914bcb070eb089 not found: ID does not exist" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.555985 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22d523bb-dd19-4e82-b484-0a27e661dc52" (UID: "22d523bb-dd19-4e82-b484-0a27e661dc52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.588092 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d523bb-dd19-4e82-b484-0a27e661dc52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.731749 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:17 crc kubenswrapper[4909]: I1002 20:03:17.741492 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2t8bz"] Oct 02 20:03:19 crc kubenswrapper[4909]: I1002 20:03:19.621182 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" path="/var/lib/kubelet/pods/22d523bb-dd19-4e82-b484-0a27e661dc52/volumes" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.356262 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 20:03:21 crc kubenswrapper[4909]: E1002 20:03:21.357318 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="registry-server" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.357334 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="registry-server" Oct 02 20:03:21 crc kubenswrapper[4909]: E1002 20:03:21.357350 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="extract-content" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.357358 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="extract-content" Oct 02 20:03:21 crc kubenswrapper[4909]: E1002 20:03:21.357400 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="extract-utilities" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.357409 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="extract-utilities" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.357699 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d523bb-dd19-4e82-b484-0a27e661dc52" containerName="registry-server" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.358820 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.361075 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.362486 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.364279 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7v74j" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.364334 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.370161 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.481824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.481915 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.481965 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.481998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.482068 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl58p\" (UniqueName: \"kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.482117 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.482146 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.482298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.482356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584859 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl58p\" (UniqueName: \"kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.584988 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.585125 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.585164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.585194 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.586258 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.586836 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.589251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.589532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.590952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.591802 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.601045 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.606959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.615943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl58p\" (UniqueName: \"kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p\") pod \"tempest-tests-tempest\" (UID: 
\"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.658408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " pod="openstack/tempest-tests-tempest" Oct 02 20:03:21 crc kubenswrapper[4909]: I1002 20:03:21.688836 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 20:03:22 crc kubenswrapper[4909]: I1002 20:03:22.240130 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 20:03:22 crc kubenswrapper[4909]: I1002 20:03:22.450551 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5c3787cb-4303-4e47-aa85-ba12c768c729","Type":"ContainerStarted","Data":"25547879b4fe41330559432409a5e1281c55f8b3635b6b46db0f49691db87512"} Oct 02 20:04:10 crc kubenswrapper[4909]: E1002 20:04:10.184227 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 02 20:04:10 crc kubenswrapper[4909]: E1002 20:04:10.188826 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cl58p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5c3787cb-4303-4e47-aa85-ba12c768c729): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 20:04:10 crc kubenswrapper[4909]: E1002 20:04:10.190396 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5c3787cb-4303-4e47-aa85-ba12c768c729" Oct 02 20:04:11 crc kubenswrapper[4909]: E1002 20:04:11.097426 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5c3787cb-4303-4e47-aa85-ba12c768c729" Oct 02 20:04:26 crc 
kubenswrapper[4909]: I1002 20:04:26.011834 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 20:04:29 crc kubenswrapper[4909]: I1002 20:04:29.309174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5c3787cb-4303-4e47-aa85-ba12c768c729","Type":"ContainerStarted","Data":"5baec6969c890c8f56d69560e674e791ba43e6041a6ff0b29a64417fe5585ee1"} Oct 02 20:04:29 crc kubenswrapper[4909]: I1002 20:04:29.333628 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.567468558 podStartE2EDuration="1m9.333608419s" podCreationTimestamp="2025-10-02 20:03:20 +0000 UTC" firstStartedPulling="2025-10-02 20:03:22.242480139 +0000 UTC m=+6323.429975998" lastFinishedPulling="2025-10-02 20:04:26.00861998 +0000 UTC m=+6387.196115859" observedRunningTime="2025-10-02 20:04:29.332101132 +0000 UTC m=+6390.519597041" watchObservedRunningTime="2025-10-02 20:04:29.333608419 +0000 UTC m=+6390.521104278" Oct 02 20:04:53 crc kubenswrapper[4909]: I1002 20:04:53.055194 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:04:53 crc kubenswrapper[4909]: I1002 20:04:53.057694 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:05:23 crc kubenswrapper[4909]: I1002 20:05:23.054684 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:05:23 crc kubenswrapper[4909]: I1002 20:05:23.056447 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.054653 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.055205 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.055251 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.056112 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:05:53 crc 
kubenswrapper[4909]: I1002 20:05:53.056535 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" gracePeriod=600 Oct 02 20:05:53 crc kubenswrapper[4909]: E1002 20:05:53.199114 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.306053 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" exitCode=0 Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.306115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58"} Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.306162 4909 scope.go:117] "RemoveContainer" containerID="111acd42ee615856c8ebd91df6ae1d9f6ea260721124bb127103fcc161ef14db" Oct 02 20:05:53 crc kubenswrapper[4909]: I1002 20:05:53.307385 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:05:53 crc kubenswrapper[4909]: E1002 20:05:53.307845 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:06:06 crc kubenswrapper[4909]: I1002 20:06:06.610706 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:06:06 crc kubenswrapper[4909]: E1002 20:06:06.611433 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:06:19 crc kubenswrapper[4909]: I1002 20:06:19.624365 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:06:19 crc kubenswrapper[4909]: E1002 20:06:19.625199 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:06:30 crc kubenswrapper[4909]: I1002 20:06:30.608579 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:06:30 crc kubenswrapper[4909]: E1002 20:06:30.609393 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:06:45 crc kubenswrapper[4909]: I1002 20:06:45.608388 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:06:45 crc kubenswrapper[4909]: E1002 20:06:45.609333 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:06:56 crc kubenswrapper[4909]: I1002 20:06:56.609071 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:06:56 crc kubenswrapper[4909]: E1002 20:06:56.610406 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:07 crc kubenswrapper[4909]: I1002 20:07:07.608781 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:07:07 crc kubenswrapper[4909]: E1002 20:07:07.609615 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:21 crc kubenswrapper[4909]: I1002 20:07:21.608839 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:07:21 crc kubenswrapper[4909]: E1002 20:07:21.609815 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:32 crc kubenswrapper[4909]: I1002 20:07:32.609420 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:07:32 crc kubenswrapper[4909]: E1002 20:07:32.610263 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:36 crc kubenswrapper[4909]: I1002 20:07:36.960352 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:36 crc kubenswrapper[4909]: I1002 20:07:36.965723 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.012094 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.013226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.013361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.013480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ds2\" (UniqueName: \"kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.115540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.116127 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.116568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.116837 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.116894 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ds2\" (UniqueName: \"kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.143548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ds2\" (UniqueName: \"kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2\") pod \"community-operators-djrpz\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:37 crc kubenswrapper[4909]: I1002 20:07:37.307270 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:38 crc kubenswrapper[4909]: I1002 20:07:38.529956 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:39 crc kubenswrapper[4909]: I1002 20:07:39.499560 4909 generic.go:334] "Generic (PLEG): container finished" podID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerID="35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28" exitCode=0 Oct 02 20:07:39 crc kubenswrapper[4909]: I1002 20:07:39.500156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerDied","Data":"35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28"} Oct 02 20:07:39 crc kubenswrapper[4909]: I1002 20:07:39.500199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerStarted","Data":"a406965b8223957cb0f5b927f00ef2cdcfa145f44665bc07aecaf6a608a88a3b"} Oct 02 20:07:41 crc kubenswrapper[4909]: I1002 20:07:41.522124 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerStarted","Data":"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d"} Oct 02 20:07:43 crc kubenswrapper[4909]: I1002 20:07:43.548540 4909 generic.go:334] "Generic (PLEG): container finished" podID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerID="f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d" exitCode=0 Oct 02 20:07:43 crc kubenswrapper[4909]: I1002 20:07:43.548646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" 
event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerDied","Data":"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d"} Oct 02 20:07:44 crc kubenswrapper[4909]: I1002 20:07:44.572211 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerStarted","Data":"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb"} Oct 02 20:07:44 crc kubenswrapper[4909]: I1002 20:07:44.607692 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djrpz" podStartSLOduration=4.118014373 podStartE2EDuration="8.607669808s" podCreationTimestamp="2025-10-02 20:07:36 +0000 UTC" firstStartedPulling="2025-10-02 20:07:39.506144609 +0000 UTC m=+6580.693640468" lastFinishedPulling="2025-10-02 20:07:43.995800004 +0000 UTC m=+6585.183295903" observedRunningTime="2025-10-02 20:07:44.599848785 +0000 UTC m=+6585.787344654" watchObservedRunningTime="2025-10-02 20:07:44.607669808 +0000 UTC m=+6585.795165677" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.261529 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.264500 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.273068 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.340480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28w5z\" (UniqueName: \"kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.340716 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.340946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.443943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28w5z\" (UniqueName: \"kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.444068 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.444116 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.444686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.444779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.463702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28w5z\" (UniqueName: \"kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z\") pod \"redhat-marketplace-r58gs\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:45 crc kubenswrapper[4909]: I1002 20:07:45.600881 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:46 crc kubenswrapper[4909]: I1002 20:07:46.122382 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:46 crc kubenswrapper[4909]: I1002 20:07:46.643860 4909 generic.go:334] "Generic (PLEG): container finished" podID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerID="38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23" exitCode=0 Oct 02 20:07:46 crc kubenswrapper[4909]: I1002 20:07:46.643977 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerDied","Data":"38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23"} Oct 02 20:07:46 crc kubenswrapper[4909]: I1002 20:07:46.644157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerStarted","Data":"5d22eb5286562b97582f82bad71921c12b3b334a45f687dc8d07425f22151804"} Oct 02 20:07:47 crc kubenswrapper[4909]: I1002 20:07:47.308315 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:47 crc kubenswrapper[4909]: I1002 20:07:47.308595 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:47 crc kubenswrapper[4909]: I1002 20:07:47.609604 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:07:47 crc kubenswrapper[4909]: E1002 20:07:47.609905 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:48 crc kubenswrapper[4909]: I1002 20:07:48.364070 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-djrpz" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="registry-server" probeResult="failure" output=< Oct 02 20:07:48 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:07:48 crc kubenswrapper[4909]: > Oct 02 20:07:48 crc kubenswrapper[4909]: I1002 20:07:48.672259 4909 generic.go:334] "Generic (PLEG): container finished" podID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerID="df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e" exitCode=0 Oct 02 20:07:48 crc kubenswrapper[4909]: I1002 20:07:48.672303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerDied","Data":"df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e"} Oct 02 20:07:49 crc kubenswrapper[4909]: I1002 20:07:49.686388 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerStarted","Data":"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e"} Oct 02 20:07:49 crc kubenswrapper[4909]: I1002 20:07:49.714841 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r58gs" podStartSLOduration=2.257377033 podStartE2EDuration="4.714818702s" podCreationTimestamp="2025-10-02 20:07:45 +0000 UTC" firstStartedPulling="2025-10-02 20:07:46.651446904 +0000 UTC m=+6587.838942753" lastFinishedPulling="2025-10-02 20:07:49.108888563 +0000 UTC 
m=+6590.296384422" observedRunningTime="2025-10-02 20:07:49.705605947 +0000 UTC m=+6590.893101846" watchObservedRunningTime="2025-10-02 20:07:49.714818702 +0000 UTC m=+6590.902314571" Oct 02 20:07:55 crc kubenswrapper[4909]: I1002 20:07:55.601575 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:55 crc kubenswrapper[4909]: I1002 20:07:55.602160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:55 crc kubenswrapper[4909]: I1002 20:07:55.657351 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:55 crc kubenswrapper[4909]: I1002 20:07:55.806744 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:55 crc kubenswrapper[4909]: I1002 20:07:55.906606 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:57 crc kubenswrapper[4909]: I1002 20:07:57.371604 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:57 crc kubenswrapper[4909]: I1002 20:07:57.447645 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:57 crc kubenswrapper[4909]: I1002 20:07:57.767098 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r58gs" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="registry-server" containerID="cri-o://0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e" gracePeriod=2 Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.306081 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.457638 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.547925 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28w5z\" (UniqueName: \"kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z\") pod \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.548435 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities\") pod \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.548508 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content\") pod \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\" (UID: \"769c259b-a280-47c4-9c18-ebfeb9ab42c9\") " Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.550298 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities" (OuterVolumeSpecName: "utilities") pod "769c259b-a280-47c4-9c18-ebfeb9ab42c9" (UID: "769c259b-a280-47c4-9c18-ebfeb9ab42c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.558820 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z" (OuterVolumeSpecName: "kube-api-access-28w5z") pod "769c259b-a280-47c4-9c18-ebfeb9ab42c9" (UID: "769c259b-a280-47c4-9c18-ebfeb9ab42c9"). InnerVolumeSpecName "kube-api-access-28w5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.570144 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "769c259b-a280-47c4-9c18-ebfeb9ab42c9" (UID: "769c259b-a280-47c4-9c18-ebfeb9ab42c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.609943 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:07:58 crc kubenswrapper[4909]: E1002 20:07:58.610632 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.651723 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.651765 4909 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769c259b-a280-47c4-9c18-ebfeb9ab42c9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.651781 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28w5z\" (UniqueName: \"kubernetes.io/projected/769c259b-a280-47c4-9c18-ebfeb9ab42c9-kube-api-access-28w5z\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782659 4909 generic.go:334] "Generic (PLEG): container finished" podID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerID="0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e" exitCode=0 Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782762 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r58gs" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782828 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerDied","Data":"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e"} Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782877 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r58gs" event={"ID":"769c259b-a280-47c4-9c18-ebfeb9ab42c9","Type":"ContainerDied","Data":"5d22eb5286562b97582f82bad71921c12b3b334a45f687dc8d07425f22151804"} Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782898 4909 scope.go:117] "RemoveContainer" containerID="0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.782918 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djrpz" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="registry-server" 
containerID="cri-o://2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb" gracePeriod=2 Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.817695 4909 scope.go:117] "RemoveContainer" containerID="df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.847841 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.849769 4909 scope.go:117] "RemoveContainer" containerID="38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23" Oct 02 20:07:58 crc kubenswrapper[4909]: I1002 20:07:58.879971 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r58gs"] Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.011944 4909 scope.go:117] "RemoveContainer" containerID="0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 20:07:59.036786 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e\": container with ID starting with 0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e not found: ID does not exist" containerID="0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.039076 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e"} err="failed to get container status \"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e\": rpc error: code = NotFound desc = could not find container \"0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e\": container with ID starting with 
0208bd426b1a3eeb8b6e4627b581d032e3fb8008083745c5d2c84767da1dec8e not found: ID does not exist" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.039116 4909 scope.go:117] "RemoveContainer" containerID="df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 20:07:59.041107 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e\": container with ID starting with df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e not found: ID does not exist" containerID="df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.041131 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e"} err="failed to get container status \"df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e\": rpc error: code = NotFound desc = could not find container \"df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e\": container with ID starting with df0630f8dd15dc641056a39cd23c11ff0c53b6b843744d3220b34e4a1003e00e not found: ID does not exist" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.041148 4909 scope.go:117] "RemoveContainer" containerID="38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 20:07:59.041980 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23\": container with ID starting with 38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23 not found: ID does not exist" containerID="38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23" Oct 02 20:07:59 crc 
kubenswrapper[4909]: I1002 20:07:59.042010 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23"} err="failed to get container status \"38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23\": rpc error: code = NotFound desc = could not find container \"38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23\": container with ID starting with 38e9bd111b46e747c281fba306206a953a7f5a6ee710e5510358404c98e19b23 not found: ID does not exist" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.522677 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.635815 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" path="/var/lib/kubelet/pods/769c259b-a280-47c4-9c18-ebfeb9ab42c9/volumes" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.691577 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content\") pod \"81e5aef7-31fd-48b3-b53c-034995c194fd\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.691781 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4ds2\" (UniqueName: \"kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2\") pod \"81e5aef7-31fd-48b3-b53c-034995c194fd\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.691910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities\") pod 
\"81e5aef7-31fd-48b3-b53c-034995c194fd\" (UID: \"81e5aef7-31fd-48b3-b53c-034995c194fd\") " Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.693812 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities" (OuterVolumeSpecName: "utilities") pod "81e5aef7-31fd-48b3-b53c-034995c194fd" (UID: "81e5aef7-31fd-48b3-b53c-034995c194fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.704289 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2" (OuterVolumeSpecName: "kube-api-access-q4ds2") pod "81e5aef7-31fd-48b3-b53c-034995c194fd" (UID: "81e5aef7-31fd-48b3-b53c-034995c194fd"). InnerVolumeSpecName "kube-api-access-q4ds2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.755925 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81e5aef7-31fd-48b3-b53c-034995c194fd" (UID: "81e5aef7-31fd-48b3-b53c-034995c194fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.795394 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.795747 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4ds2\" (UniqueName: \"kubernetes.io/projected/81e5aef7-31fd-48b3-b53c-034995c194fd-kube-api-access-q4ds2\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.795759 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81e5aef7-31fd-48b3-b53c-034995c194fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.795975 4909 generic.go:334] "Generic (PLEG): container finished" podID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerID="2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb" exitCode=0 Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.796472 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerDied","Data":"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb"} Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.796484 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djrpz" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.796606 4909 scope.go:117] "RemoveContainer" containerID="2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.799590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrpz" event={"ID":"81e5aef7-31fd-48b3-b53c-034995c194fd","Type":"ContainerDied","Data":"a406965b8223957cb0f5b927f00ef2cdcfa145f44665bc07aecaf6a608a88a3b"} Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.848179 4909 scope.go:117] "RemoveContainer" containerID="f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.868268 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.878737 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djrpz"] Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.880555 4909 scope.go:117] "RemoveContainer" containerID="35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.901645 4909 scope.go:117] "RemoveContainer" containerID="2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 20:07:59.902056 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb\": container with ID starting with 2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb not found: ID does not exist" containerID="2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.902110 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb"} err="failed to get container status \"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb\": rpc error: code = NotFound desc = could not find container \"2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb\": container with ID starting with 2523b44f2e3f5d89b8b492dee6e46018aa4a43ec149c5a916d3f778c91287bfb not found: ID does not exist" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.902144 4909 scope.go:117] "RemoveContainer" containerID="f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 20:07:59.902729 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d\": container with ID starting with f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d not found: ID does not exist" containerID="f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.902805 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d"} err="failed to get container status \"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d\": rpc error: code = NotFound desc = could not find container \"f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d\": container with ID starting with f6cf8361e3b2bc774d3cad28181f2d2a83d529f265a0cacd19e8bd94f195593d not found: ID does not exist" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.902834 4909 scope.go:117] "RemoveContainer" containerID="35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28" Oct 02 20:07:59 crc kubenswrapper[4909]: E1002 
20:07:59.903389 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28\": container with ID starting with 35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28 not found: ID does not exist" containerID="35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28" Oct 02 20:07:59 crc kubenswrapper[4909]: I1002 20:07:59.903425 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28"} err="failed to get container status \"35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28\": rpc error: code = NotFound desc = could not find container \"35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28\": container with ID starting with 35414c15cf7a54efc78475dc96abd648d7efe871a0aa070ac156a6d58105da28 not found: ID does not exist" Oct 02 20:08:01 crc kubenswrapper[4909]: I1002 20:08:01.631335 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" path="/var/lib/kubelet/pods/81e5aef7-31fd-48b3-b53c-034995c194fd/volumes" Oct 02 20:08:11 crc kubenswrapper[4909]: I1002 20:08:11.609956 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:08:11 crc kubenswrapper[4909]: E1002 20:08:11.610619 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:08:24 crc kubenswrapper[4909]: I1002 20:08:24.608296 
4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:08:24 crc kubenswrapper[4909]: E1002 20:08:24.609096 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:08:35 crc kubenswrapper[4909]: I1002 20:08:35.609525 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:08:35 crc kubenswrapper[4909]: E1002 20:08:35.613240 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:08:48 crc kubenswrapper[4909]: I1002 20:08:48.609424 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:08:48 crc kubenswrapper[4909]: E1002 20:08:48.610252 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:09:00 crc kubenswrapper[4909]: I1002 
20:09:00.608635 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:09:00 crc kubenswrapper[4909]: E1002 20:09:00.609317 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:09:14 crc kubenswrapper[4909]: I1002 20:09:14.608721 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:09:14 crc kubenswrapper[4909]: E1002 20:09:14.609408 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:09:28 crc kubenswrapper[4909]: I1002 20:09:28.608729 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:09:28 crc kubenswrapper[4909]: E1002 20:09:28.609818 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:09:39 crc 
kubenswrapper[4909]: I1002 20:09:39.616300 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:09:39 crc kubenswrapper[4909]: E1002 20:09:39.617147 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:09:51 crc kubenswrapper[4909]: I1002 20:09:51.608316 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:09:51 crc kubenswrapper[4909]: E1002 20:09:51.609486 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:10:02 crc kubenswrapper[4909]: I1002 20:10:02.608432 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:10:02 crc kubenswrapper[4909]: E1002 20:10:02.609321 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 
02 20:10:15 crc kubenswrapper[4909]: I1002 20:10:15.609167 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:10:15 crc kubenswrapper[4909]: E1002 20:10:15.615556 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:10:30 crc kubenswrapper[4909]: I1002 20:10:30.608945 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:10:30 crc kubenswrapper[4909]: E1002 20:10:30.609918 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:10:44 crc kubenswrapper[4909]: I1002 20:10:44.608817 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:10:44 crc kubenswrapper[4909]: E1002 20:10:44.609660 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:10:57 crc kubenswrapper[4909]: I1002 20:10:57.609320 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:10:58 crc kubenswrapper[4909]: I1002 20:10:58.769128 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87"} Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.374389 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.375527 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="extract-content" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376731 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="extract-content" Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.376773 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="extract-utilities" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376784 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="extract-utilities" Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.376807 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376817 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.376856 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="extract-content" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376865 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="extract-content" Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.376902 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376912 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: E1002 20:12:27.376952 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="extract-utilities" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.376963 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="extract-utilities" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.377356 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="769c259b-a280-47c4-9c18-ebfeb9ab42c9" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.377402 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e5aef7-31fd-48b3-b53c-034995c194fd" containerName="registry-server" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.379815 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.394231 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.566243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4w5b\" (UniqueName: \"kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.566315 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.566436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.669579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4w5b\" (UniqueName: \"kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.669705 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.670316 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.670433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.670808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.692672 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4w5b\" (UniqueName: \"kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b\") pod \"redhat-operators-jvr6t\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:27 crc kubenswrapper[4909]: I1002 20:12:27.717759 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:28 crc kubenswrapper[4909]: I1002 20:12:28.310832 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:28 crc kubenswrapper[4909]: I1002 20:12:28.845252 4909 generic.go:334] "Generic (PLEG): container finished" podID="d898bcfa-927a-4a46-b9d8-393349b83038" containerID="27639680e97f0ba067203bc0c002c831b5ff63fae38da83b4acb716e7b161d28" exitCode=0 Oct 02 20:12:28 crc kubenswrapper[4909]: I1002 20:12:28.845319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerDied","Data":"27639680e97f0ba067203bc0c002c831b5ff63fae38da83b4acb716e7b161d28"} Oct 02 20:12:28 crc kubenswrapper[4909]: I1002 20:12:28.845586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerStarted","Data":"24853ac45eb02319ade070683ce34c562e0cd0eccb10947f3be055fa1f5ac0a1"} Oct 02 20:12:28 crc kubenswrapper[4909]: I1002 20:12:28.849215 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:12:30 crc kubenswrapper[4909]: I1002 20:12:30.873593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerStarted","Data":"399dac3752229917e22395c208c13706b333cd3b3b9b5e3b544563cbcb4d1c86"} Oct 02 20:12:34 crc kubenswrapper[4909]: I1002 20:12:34.920914 4909 generic.go:334] "Generic (PLEG): container finished" podID="d898bcfa-927a-4a46-b9d8-393349b83038" containerID="399dac3752229917e22395c208c13706b333cd3b3b9b5e3b544563cbcb4d1c86" exitCode=0 Oct 02 20:12:34 crc kubenswrapper[4909]: I1002 20:12:34.921011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerDied","Data":"399dac3752229917e22395c208c13706b333cd3b3b9b5e3b544563cbcb4d1c86"} Oct 02 20:12:35 crc kubenswrapper[4909]: I1002 20:12:35.932843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerStarted","Data":"74297c48c4d4a225864687a0c2a9968a2363b024460c008aaba4c0c430194096"} Oct 02 20:12:35 crc kubenswrapper[4909]: I1002 20:12:35.954993 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvr6t" podStartSLOduration=2.1862451 podStartE2EDuration="8.954970586s" podCreationTimestamp="2025-10-02 20:12:27 +0000 UTC" firstStartedPulling="2025-10-02 20:12:28.84734762 +0000 UTC m=+6870.034843479" lastFinishedPulling="2025-10-02 20:12:35.616073106 +0000 UTC m=+6876.803568965" observedRunningTime="2025-10-02 20:12:35.950377044 +0000 UTC m=+6877.137872923" watchObservedRunningTime="2025-10-02 20:12:35.954970586 +0000 UTC m=+6877.142466455" Oct 02 20:12:37 crc kubenswrapper[4909]: I1002 20:12:37.718667 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:37 crc kubenswrapper[4909]: I1002 20:12:37.719248 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:38 crc kubenswrapper[4909]: I1002 20:12:38.764732 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvr6t" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="registry-server" probeResult="failure" output=< Oct 02 20:12:38 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:12:38 crc kubenswrapper[4909]: > Oct 02 20:12:47 crc kubenswrapper[4909]: I1002 
20:12:47.779709 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:47 crc kubenswrapper[4909]: I1002 20:12:47.833763 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:48 crc kubenswrapper[4909]: I1002 20:12:48.021777 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:49 crc kubenswrapper[4909]: I1002 20:12:49.112445 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvr6t" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="registry-server" containerID="cri-o://74297c48c4d4a225864687a0c2a9968a2363b024460c008aaba4c0c430194096" gracePeriod=2 Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.124282 4909 generic.go:334] "Generic (PLEG): container finished" podID="d898bcfa-927a-4a46-b9d8-393349b83038" containerID="74297c48c4d4a225864687a0c2a9968a2363b024460c008aaba4c0c430194096" exitCode=0 Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.125687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerDied","Data":"74297c48c4d4a225864687a0c2a9968a2363b024460c008aaba4c0c430194096"} Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.126017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvr6t" event={"ID":"d898bcfa-927a-4a46-b9d8-393349b83038","Type":"ContainerDied","Data":"24853ac45eb02319ade070683ce34c562e0cd0eccb10947f3be055fa1f5ac0a1"} Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.126143 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24853ac45eb02319ade070683ce34c562e0cd0eccb10947f3be055fa1f5ac0a1" Oct 02 20:12:50 crc 
kubenswrapper[4909]: I1002 20:12:50.214805 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.342838 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities\") pod \"d898bcfa-927a-4a46-b9d8-393349b83038\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.343048 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4w5b\" (UniqueName: \"kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b\") pod \"d898bcfa-927a-4a46-b9d8-393349b83038\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.343317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content\") pod \"d898bcfa-927a-4a46-b9d8-393349b83038\" (UID: \"d898bcfa-927a-4a46-b9d8-393349b83038\") " Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.350845 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities" (OuterVolumeSpecName: "utilities") pod "d898bcfa-927a-4a46-b9d8-393349b83038" (UID: "d898bcfa-927a-4a46-b9d8-393349b83038"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.371289 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b" (OuterVolumeSpecName: "kube-api-access-z4w5b") pod "d898bcfa-927a-4a46-b9d8-393349b83038" (UID: "d898bcfa-927a-4a46-b9d8-393349b83038"). InnerVolumeSpecName "kube-api-access-z4w5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.456241 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.456296 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4w5b\" (UniqueName: \"kubernetes.io/projected/d898bcfa-927a-4a46-b9d8-393349b83038-kube-api-access-z4w5b\") on node \"crc\" DevicePath \"\"" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.493807 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d898bcfa-927a-4a46-b9d8-393349b83038" (UID: "d898bcfa-927a-4a46-b9d8-393349b83038"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:12:50 crc kubenswrapper[4909]: I1002 20:12:50.558376 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898bcfa-927a-4a46-b9d8-393349b83038-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:12:51 crc kubenswrapper[4909]: I1002 20:12:51.134749 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvr6t" Oct 02 20:12:51 crc kubenswrapper[4909]: I1002 20:12:51.184910 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:51 crc kubenswrapper[4909]: I1002 20:12:51.195113 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvr6t"] Oct 02 20:12:51 crc kubenswrapper[4909]: I1002 20:12:51.620275 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" path="/var/lib/kubelet/pods/d898bcfa-927a-4a46-b9d8-393349b83038/volumes" Oct 02 20:13:23 crc kubenswrapper[4909]: I1002 20:13:23.055181 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:13:23 crc kubenswrapper[4909]: I1002 20:13:23.057257 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:13:53 crc kubenswrapper[4909]: I1002 20:13:53.054838 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:13:53 crc kubenswrapper[4909]: I1002 20:13:53.056305 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.112683 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:13:57 crc kubenswrapper[4909]: E1002 20:13:57.113583 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="registry-server" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.113594 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="registry-server" Oct 02 20:13:57 crc kubenswrapper[4909]: E1002 20:13:57.113605 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="extract-content" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.113611 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="extract-content" Oct 02 20:13:57 crc kubenswrapper[4909]: E1002 20:13:57.113626 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="extract-utilities" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.113632 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="extract-utilities" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.113850 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d898bcfa-927a-4a46-b9d8-393349b83038" containerName="registry-server" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.116056 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.134846 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.259566 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.259855 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45r4b\" (UniqueName: \"kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.259929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.362984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45r4b\" (UniqueName: \"kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.363170 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.363416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.364495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.364510 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.401610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45r4b\" (UniqueName: \"kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b\") pod \"certified-operators-c6tf5\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.454938 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:13:57 crc kubenswrapper[4909]: I1002 20:13:57.948088 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:13:58 crc kubenswrapper[4909]: I1002 20:13:58.903685 4909 generic.go:334] "Generic (PLEG): container finished" podID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerID="3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece" exitCode=0 Oct 02 20:13:58 crc kubenswrapper[4909]: I1002 20:13:58.903790 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerDied","Data":"3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece"} Oct 02 20:13:58 crc kubenswrapper[4909]: I1002 20:13:58.904642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerStarted","Data":"c10f115df6c8bcda2129cbf53375008fd8bbd8ae9e9011ad745bd69cd99ec20c"} Oct 02 20:14:00 crc kubenswrapper[4909]: I1002 20:14:00.931787 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerStarted","Data":"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce"} Oct 02 20:14:01 crc kubenswrapper[4909]: I1002 20:14:01.944834 4909 generic.go:334] "Generic (PLEG): container finished" podID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerID="401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce" exitCode=0 Oct 02 20:14:01 crc kubenswrapper[4909]: I1002 20:14:01.944924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" 
event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerDied","Data":"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce"} Oct 02 20:14:02 crc kubenswrapper[4909]: I1002 20:14:02.966570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerStarted","Data":"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968"} Oct 02 20:14:02 crc kubenswrapper[4909]: I1002 20:14:02.991502 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6tf5" podStartSLOduration=2.403793757 podStartE2EDuration="5.991484721s" podCreationTimestamp="2025-10-02 20:13:57 +0000 UTC" firstStartedPulling="2025-10-02 20:13:58.906685805 +0000 UTC m=+6960.094181704" lastFinishedPulling="2025-10-02 20:14:02.494376769 +0000 UTC m=+6963.681872668" observedRunningTime="2025-10-02 20:14:02.990996746 +0000 UTC m=+6964.178492695" watchObservedRunningTime="2025-10-02 20:14:02.991484721 +0000 UTC m=+6964.178980580" Oct 02 20:14:07 crc kubenswrapper[4909]: I1002 20:14:07.455243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:07 crc kubenswrapper[4909]: I1002 20:14:07.455688 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:07 crc kubenswrapper[4909]: I1002 20:14:07.518952 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:08 crc kubenswrapper[4909]: I1002 20:14:08.065369 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:08 crc kubenswrapper[4909]: I1002 20:14:08.117131 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.034191 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6tf5" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="registry-server" containerID="cri-o://b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968" gracePeriod=2 Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.520286 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.598086 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities\") pod \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.598237 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content\") pod \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.598357 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45r4b\" (UniqueName: \"kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b\") pod \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\" (UID: \"8549e5b8-a681-499b-a22b-0cfdbc61e77a\") " Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.600520 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities" (OuterVolumeSpecName: "utilities") pod "8549e5b8-a681-499b-a22b-0cfdbc61e77a" (UID: 
"8549e5b8-a681-499b-a22b-0cfdbc61e77a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.604075 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b" (OuterVolumeSpecName: "kube-api-access-45r4b") pod "8549e5b8-a681-499b-a22b-0cfdbc61e77a" (UID: "8549e5b8-a681-499b-a22b-0cfdbc61e77a"). InnerVolumeSpecName "kube-api-access-45r4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.653600 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8549e5b8-a681-499b-a22b-0cfdbc61e77a" (UID: "8549e5b8-a681-499b-a22b-0cfdbc61e77a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.702335 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.702375 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8549e5b8-a681-499b-a22b-0cfdbc61e77a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:14:10 crc kubenswrapper[4909]: I1002 20:14:10.702426 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45r4b\" (UniqueName: \"kubernetes.io/projected/8549e5b8-a681-499b-a22b-0cfdbc61e77a-kube-api-access-45r4b\") on node \"crc\" DevicePath \"\"" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.052369 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerID="b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968" exitCode=0 Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.052411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerDied","Data":"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968"} Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.052436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6tf5" event={"ID":"8549e5b8-a681-499b-a22b-0cfdbc61e77a","Type":"ContainerDied","Data":"c10f115df6c8bcda2129cbf53375008fd8bbd8ae9e9011ad745bd69cd99ec20c"} Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.052452 4909 scope.go:117] "RemoveContainer" containerID="b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.052572 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6tf5" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.083870 4909 scope.go:117] "RemoveContainer" containerID="401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.104265 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.109458 4909 scope.go:117] "RemoveContainer" containerID="3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.125175 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6tf5"] Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.186882 4909 scope.go:117] "RemoveContainer" containerID="b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968" Oct 02 20:14:11 crc kubenswrapper[4909]: E1002 20:14:11.187433 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968\": container with ID starting with b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968 not found: ID does not exist" containerID="b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.187460 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968"} err="failed to get container status \"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968\": rpc error: code = NotFound desc = could not find container \"b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968\": container with ID starting with b04a73787d4d807c80197a19c5ccfd99b25c0623b907c3108c444e0f51aee968 not 
found: ID does not exist" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.187478 4909 scope.go:117] "RemoveContainer" containerID="401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce" Oct 02 20:14:11 crc kubenswrapper[4909]: E1002 20:14:11.187783 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce\": container with ID starting with 401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce not found: ID does not exist" containerID="401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.187801 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce"} err="failed to get container status \"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce\": rpc error: code = NotFound desc = could not find container \"401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce\": container with ID starting with 401866add810bed12f93bf671fbaa7c4d319141db245f5afad899a2520ae75ce not found: ID does not exist" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.187815 4909 scope.go:117] "RemoveContainer" containerID="3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece" Oct 02 20:14:11 crc kubenswrapper[4909]: E1002 20:14:11.187988 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece\": container with ID starting with 3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece not found: ID does not exist" containerID="3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.188007 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece"} err="failed to get container status \"3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece\": rpc error: code = NotFound desc = could not find container \"3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece\": container with ID starting with 3c81af988da3f4c0cd8d0abca1fb98101aa391e6590f9ed2b5d02bff0123dece not found: ID does not exist" Oct 02 20:14:11 crc kubenswrapper[4909]: I1002 20:14:11.661157 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" path="/var/lib/kubelet/pods/8549e5b8-a681-499b-a22b-0cfdbc61e77a/volumes" Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.054170 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.054640 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.054685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.055569 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.055627 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87" gracePeriod=600 Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.232055 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87" exitCode=0 Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.232107 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87"} Oct 02 20:14:23 crc kubenswrapper[4909]: I1002 20:14:23.232151 4909 scope.go:117] "RemoveContainer" containerID="5f63dd0876e9526a467a5fe89e18378366eff57bce0fde69821916bed65c4b58" Oct 02 20:14:23 crc kubenswrapper[4909]: E1002 20:14:23.281966 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31958374_7b04_45be_9509_c51e08f9afe2.slice/crio-281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31958374_7b04_45be_9509_c51e08f9afe2.slice/crio-conmon-281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87.scope\": RecentStats: unable to find data in memory cache]" Oct 02 20:14:24 crc kubenswrapper[4909]: I1002 20:14:24.242242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc"} Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.212260 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj"] Oct 02 20:15:00 crc kubenswrapper[4909]: E1002 20:15:00.213948 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="extract-content" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.213971 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="extract-content" Oct 02 20:15:00 crc kubenswrapper[4909]: E1002 20:15:00.214023 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="registry-server" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.214047 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="registry-server" Oct 02 20:15:00 crc kubenswrapper[4909]: E1002 20:15:00.214061 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="extract-utilities" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.214071 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="extract-utilities" Oct 02 20:15:00 crc 
kubenswrapper[4909]: I1002 20:15:00.214402 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8549e5b8-a681-499b-a22b-0cfdbc61e77a" containerName="registry-server" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.215805 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.223577 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj"] Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.262461 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.264560 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.359011 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.359173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.359496 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clsv\" (UniqueName: \"kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.463941 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.462335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.464471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.465905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clsv\" (UniqueName: \"kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.473906 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.510332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clsv\" (UniqueName: \"kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv\") pod \"collect-profiles-29323935-q9pwj\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:00 crc kubenswrapper[4909]: I1002 20:15:00.579630 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:01 crc kubenswrapper[4909]: I1002 20:15:01.077802 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj"] Oct 02 20:15:01 crc kubenswrapper[4909]: I1002 20:15:01.734131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" event={"ID":"98eadf7c-b5dc-4f29-ac22-67e34149a864","Type":"ContainerStarted","Data":"fac27311bafcae5d65a3bdd944a0da9b88dcefa8ab47fa8402b70e1fcdfe5a9f"} Oct 02 20:15:01 crc kubenswrapper[4909]: I1002 20:15:01.734452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" event={"ID":"98eadf7c-b5dc-4f29-ac22-67e34149a864","Type":"ContainerStarted","Data":"12dba33c1380126879a40da8bde1c9a6ce2ddd9e7a1f7a5812a45ee9828b300b"} Oct 02 20:15:01 crc kubenswrapper[4909]: I1002 20:15:01.750017 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" podStartSLOduration=1.7499974379999998 podStartE2EDuration="1.749997438s" podCreationTimestamp="2025-10-02 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 20:15:01.748845672 +0000 UTC m=+7022.936341541" watchObservedRunningTime="2025-10-02 20:15:01.749997438 +0000 UTC m=+7022.937493297" Oct 02 20:15:02 crc kubenswrapper[4909]: I1002 20:15:02.767953 4909 generic.go:334] "Generic (PLEG): container finished" podID="98eadf7c-b5dc-4f29-ac22-67e34149a864" containerID="fac27311bafcae5d65a3bdd944a0da9b88dcefa8ab47fa8402b70e1fcdfe5a9f" exitCode=0 Oct 02 20:15:02 crc kubenswrapper[4909]: I1002 20:15:02.768203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" event={"ID":"98eadf7c-b5dc-4f29-ac22-67e34149a864","Type":"ContainerDied","Data":"fac27311bafcae5d65a3bdd944a0da9b88dcefa8ab47fa8402b70e1fcdfe5a9f"} Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.221555 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.258461 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5clsv\" (UniqueName: \"kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv\") pod \"98eadf7c-b5dc-4f29-ac22-67e34149a864\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.259940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume\") pod \"98eadf7c-b5dc-4f29-ac22-67e34149a864\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.260235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume\") pod \"98eadf7c-b5dc-4f29-ac22-67e34149a864\" (UID: \"98eadf7c-b5dc-4f29-ac22-67e34149a864\") " Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.261619 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume" (OuterVolumeSpecName: "config-volume") pod "98eadf7c-b5dc-4f29-ac22-67e34149a864" (UID: "98eadf7c-b5dc-4f29-ac22-67e34149a864"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.267635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98eadf7c-b5dc-4f29-ac22-67e34149a864" (UID: "98eadf7c-b5dc-4f29-ac22-67e34149a864"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.267869 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv" (OuterVolumeSpecName: "kube-api-access-5clsv") pod "98eadf7c-b5dc-4f29-ac22-67e34149a864" (UID: "98eadf7c-b5dc-4f29-ac22-67e34149a864"). InnerVolumeSpecName "kube-api-access-5clsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.364247 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5clsv\" (UniqueName: \"kubernetes.io/projected/98eadf7c-b5dc-4f29-ac22-67e34149a864-kube-api-access-5clsv\") on node \"crc\" DevicePath \"\"" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.364295 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98eadf7c-b5dc-4f29-ac22-67e34149a864-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.364313 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98eadf7c-b5dc-4f29-ac22-67e34149a864-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.805731 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" 
event={"ID":"98eadf7c-b5dc-4f29-ac22-67e34149a864","Type":"ContainerDied","Data":"12dba33c1380126879a40da8bde1c9a6ce2ddd9e7a1f7a5812a45ee9828b300b"} Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.806094 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12dba33c1380126879a40da8bde1c9a6ce2ddd9e7a1f7a5812a45ee9828b300b" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.805982 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323935-q9pwj" Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.854360 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t"] Oct 02 20:15:04 crc kubenswrapper[4909]: I1002 20:15:04.866385 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323890-xrk6t"] Oct 02 20:15:05 crc kubenswrapper[4909]: I1002 20:15:05.620293 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c48ce2-b55d-4a49-a3e7-1fc253d6c668" path="/var/lib/kubelet/pods/75c48ce2-b55d-4a49-a3e7-1fc253d6c668/volumes" Oct 02 20:15:29 crc kubenswrapper[4909]: I1002 20:15:29.666233 4909 scope.go:117] "RemoveContainer" containerID="f0e8da85ea2b32598b2ec2464fa9d7ceec7349d045518219bd41f85cca9429b7" Oct 02 20:16:23 crc kubenswrapper[4909]: I1002 20:16:23.054796 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:16:23 crc kubenswrapper[4909]: I1002 20:16:23.055801 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:16:53 crc kubenswrapper[4909]: I1002 20:16:53.054144 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:16:53 crc kubenswrapper[4909]: I1002 20:16:53.054917 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.054964 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.055551 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.055602 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.056591 4909 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.056654 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" gracePeriod=600 Oct 02 20:17:23 crc kubenswrapper[4909]: E1002 20:17:23.183358 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.523315 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" exitCode=0 Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.523379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc"} Oct 02 20:17:23 crc kubenswrapper[4909]: I1002 20:17:23.523424 4909 scope.go:117] "RemoveContainer" containerID="281f6346f8ea6364c28190843e8e51ca101c7d719026563b2c73bf5fcd702f87" Oct 02 20:17:23 crc 
kubenswrapper[4909]: I1002 20:17:23.524453 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:17:23 crc kubenswrapper[4909]: E1002 20:17:23.525004 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:17:38 crc kubenswrapper[4909]: I1002 20:17:38.609152 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:17:38 crc kubenswrapper[4909]: E1002 20:17:38.610200 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.068765 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:17:46 crc kubenswrapper[4909]: E1002 20:17:46.069866 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98eadf7c-b5dc-4f29-ac22-67e34149a864" containerName="collect-profiles" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.069881 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98eadf7c-b5dc-4f29-ac22-67e34149a864" containerName="collect-profiles" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.070282 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="98eadf7c-b5dc-4f29-ac22-67e34149a864" containerName="collect-profiles" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.072232 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.080096 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.147861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.148356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.148501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.251333 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities\") pod \"redhat-marketplace-svwz6\" 
(UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.251551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.251637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.251978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.252330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content\") pod \"redhat-marketplace-svwz6\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.281636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn\") pod \"redhat-marketplace-svwz6\" (UID: 
\"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:46 crc kubenswrapper[4909]: I1002 20:17:46.407906 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:47 crc kubenswrapper[4909]: I1002 20:17:47.073289 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:17:47 crc kubenswrapper[4909]: I1002 20:17:47.813207 4909 generic.go:334] "Generic (PLEG): container finished" podID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerID="bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe" exitCode=0 Oct 02 20:17:47 crc kubenswrapper[4909]: I1002 20:17:47.813304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerDied","Data":"bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe"} Oct 02 20:17:47 crc kubenswrapper[4909]: I1002 20:17:47.813507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerStarted","Data":"c63e4cb5eec5fff1757e145873d83d8b5db2b6c53f02b92a9571c68adb289aaf"} Oct 02 20:17:47 crc kubenswrapper[4909]: I1002 20:17:47.816227 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:17:49 crc kubenswrapper[4909]: I1002 20:17:49.858202 4909 generic.go:334] "Generic (PLEG): container finished" podID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerID="883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54" exitCode=0 Oct 02 20:17:49 crc kubenswrapper[4909]: I1002 20:17:49.858843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" 
event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerDied","Data":"883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54"} Oct 02 20:17:50 crc kubenswrapper[4909]: I1002 20:17:50.870784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerStarted","Data":"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb"} Oct 02 20:17:50 crc kubenswrapper[4909]: I1002 20:17:50.903328 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svwz6" podStartSLOduration=2.449690421 podStartE2EDuration="4.90330496s" podCreationTimestamp="2025-10-02 20:17:46 +0000 UTC" firstStartedPulling="2025-10-02 20:17:47.815960528 +0000 UTC m=+7189.003456397" lastFinishedPulling="2025-10-02 20:17:50.269575037 +0000 UTC m=+7191.457070936" observedRunningTime="2025-10-02 20:17:50.886804588 +0000 UTC m=+7192.074300447" watchObservedRunningTime="2025-10-02 20:17:50.90330496 +0000 UTC m=+7192.090800819" Oct 02 20:17:51 crc kubenswrapper[4909]: I1002 20:17:51.609206 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:17:51 crc kubenswrapper[4909]: E1002 20:17:51.609932 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:17:56 crc kubenswrapper[4909]: I1002 20:17:56.408474 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:56 crc 
kubenswrapper[4909]: I1002 20:17:56.408879 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:56 crc kubenswrapper[4909]: I1002 20:17:56.493852 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:57 crc kubenswrapper[4909]: I1002 20:17:57.038430 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:57 crc kubenswrapper[4909]: I1002 20:17:57.098778 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:17:58 crc kubenswrapper[4909]: I1002 20:17:58.969413 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svwz6" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="registry-server" containerID="cri-o://96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb" gracePeriod=2 Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.587304 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.700926 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content\") pod \"b8e8adf8-3e20-44f8-882a-eb482058da0f\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.700966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities\") pod \"b8e8adf8-3e20-44f8-882a-eb482058da0f\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.702258 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities" (OuterVolumeSpecName: "utilities") pod "b8e8adf8-3e20-44f8-882a-eb482058da0f" (UID: "b8e8adf8-3e20-44f8-882a-eb482058da0f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.702813 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn\") pod \"b8e8adf8-3e20-44f8-882a-eb482058da0f\" (UID: \"b8e8adf8-3e20-44f8-882a-eb482058da0f\") " Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.703781 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.712763 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn" (OuterVolumeSpecName: "kube-api-access-xwprn") pod "b8e8adf8-3e20-44f8-882a-eb482058da0f" (UID: "b8e8adf8-3e20-44f8-882a-eb482058da0f"). InnerVolumeSpecName "kube-api-access-xwprn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.717187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8e8adf8-3e20-44f8-882a-eb482058da0f" (UID: "b8e8adf8-3e20-44f8-882a-eb482058da0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.806999 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b8e8adf8-3e20-44f8-882a-eb482058da0f-kube-api-access-xwprn\") on node \"crc\" DevicePath \"\"" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.807078 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e8adf8-3e20-44f8-882a-eb482058da0f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.982916 4909 generic.go:334] "Generic (PLEG): container finished" podID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerID="96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb" exitCode=0 Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.982961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerDied","Data":"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb"} Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.982989 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwz6" Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.982998 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwz6" event={"ID":"b8e8adf8-3e20-44f8-882a-eb482058da0f","Type":"ContainerDied","Data":"c63e4cb5eec5fff1757e145873d83d8b5db2b6c53f02b92a9571c68adb289aaf"} Oct 02 20:17:59 crc kubenswrapper[4909]: I1002 20:17:59.983038 4909 scope.go:117] "RemoveContainer" containerID="96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.025575 4909 scope.go:117] "RemoveContainer" containerID="883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.032201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.041350 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwz6"] Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.058789 4909 scope.go:117] "RemoveContainer" containerID="bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.115737 4909 scope.go:117] "RemoveContainer" containerID="96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb" Oct 02 20:18:00 crc kubenswrapper[4909]: E1002 20:18:00.116341 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb\": container with ID starting with 96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb not found: ID does not exist" containerID="96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.116399 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb"} err="failed to get container status \"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb\": rpc error: code = NotFound desc = could not find container \"96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb\": container with ID starting with 96a64873ac51d5ccb54b7ed7984a10f8490fd341d698981462e4c320b01ceddb not found: ID does not exist" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.116439 4909 scope.go:117] "RemoveContainer" containerID="883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54" Oct 02 20:18:00 crc kubenswrapper[4909]: E1002 20:18:00.116789 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54\": container with ID starting with 883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54 not found: ID does not exist" containerID="883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.116831 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54"} err="failed to get container status \"883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54\": rpc error: code = NotFound desc = could not find container \"883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54\": container with ID starting with 883d70d03248d080150647195c2136ce76b6192a4fb81031a0981c5a5960dc54 not found: ID does not exist" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.116859 4909 scope.go:117] "RemoveContainer" containerID="bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe" Oct 02 20:18:00 crc kubenswrapper[4909]: E1002 
20:18:00.117369 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe\": container with ID starting with bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe not found: ID does not exist" containerID="bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe" Oct 02 20:18:00 crc kubenswrapper[4909]: I1002 20:18:00.117405 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe"} err="failed to get container status \"bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe\": rpc error: code = NotFound desc = could not find container \"bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe\": container with ID starting with bd6cb727c6980112d3f28f2c95fce45baf02425aa74a43734d4f2154ca24fafe not found: ID does not exist" Oct 02 20:18:01 crc kubenswrapper[4909]: I1002 20:18:01.655321 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" path="/var/lib/kubelet/pods/b8e8adf8-3e20-44f8-882a-eb482058da0f/volumes" Oct 02 20:18:03 crc kubenswrapper[4909]: I1002 20:18:03.609066 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:18:03 crc kubenswrapper[4909]: E1002 20:18:03.609811 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:18:17 crc kubenswrapper[4909]: I1002 20:18:17.608579 
4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:18:17 crc kubenswrapper[4909]: E1002 20:18:17.610071 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:18:28 crc kubenswrapper[4909]: I1002 20:18:28.611135 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:18:28 crc kubenswrapper[4909]: E1002 20:18:28.613668 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:18:29 crc kubenswrapper[4909]: I1002 20:18:29.775999 4909 scope.go:117] "RemoveContainer" containerID="27639680e97f0ba067203bc0c002c831b5ff63fae38da83b4acb716e7b161d28" Oct 02 20:18:29 crc kubenswrapper[4909]: I1002 20:18:29.814517 4909 scope.go:117] "RemoveContainer" containerID="399dac3752229917e22395c208c13706b333cd3b3b9b5e3b544563cbcb4d1c86" Oct 02 20:18:42 crc kubenswrapper[4909]: I1002 20:18:42.608740 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:18:42 crc kubenswrapper[4909]: E1002 20:18:42.609752 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:18:55 crc kubenswrapper[4909]: I1002 20:18:55.608593 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:18:55 crc kubenswrapper[4909]: E1002 20:18:55.609629 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:19:06 crc kubenswrapper[4909]: I1002 20:19:06.609111 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:19:06 crc kubenswrapper[4909]: E1002 20:19:06.610294 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:19:17 crc kubenswrapper[4909]: I1002 20:19:17.610012 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:19:17 crc kubenswrapper[4909]: E1002 20:19:17.610799 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:19:29 crc kubenswrapper[4909]: I1002 20:19:29.912651 4909 scope.go:117] "RemoveContainer" containerID="74297c48c4d4a225864687a0c2a9968a2363b024460c008aaba4c0c430194096" Oct 02 20:19:32 crc kubenswrapper[4909]: I1002 20:19:32.610395 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:19:32 crc kubenswrapper[4909]: E1002 20:19:32.611147 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:19:45 crc kubenswrapper[4909]: I1002 20:19:45.609061 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:19:45 crc kubenswrapper[4909]: E1002 20:19:45.610153 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:19:59 crc kubenswrapper[4909]: I1002 20:19:59.616870 4909 scope.go:117] "RemoveContainer" 
containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:19:59 crc kubenswrapper[4909]: E1002 20:19:59.618017 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.269896 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:09 crc kubenswrapper[4909]: E1002 20:20:09.270978 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="extract-content" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.270992 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="extract-content" Oct 02 20:20:09 crc kubenswrapper[4909]: E1002 20:20:09.271008 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="registry-server" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.271015 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="registry-server" Oct 02 20:20:09 crc kubenswrapper[4909]: E1002 20:20:09.271056 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="extract-utilities" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.271063 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="extract-utilities" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.271258 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e8adf8-3e20-44f8-882a-eb482058da0f" containerName="registry-server" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.273004 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.293154 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.409516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.409564 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsjm\" (UniqueName: \"kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.409777 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.511814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.512311 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.512436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsjm\" (UniqueName: \"kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.512612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.512698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.535521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsjm\" (UniqueName: 
\"kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm\") pod \"community-operators-bqwgf\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:09 crc kubenswrapper[4909]: I1002 20:20:09.638961 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:10 crc kubenswrapper[4909]: I1002 20:20:10.154545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:10 crc kubenswrapper[4909]: I1002 20:20:10.668728 4909 generic.go:334] "Generic (PLEG): container finished" podID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerID="6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4" exitCode=0 Oct 02 20:20:10 crc kubenswrapper[4909]: I1002 20:20:10.668813 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerDied","Data":"6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4"} Oct 02 20:20:10 crc kubenswrapper[4909]: I1002 20:20:10.669178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerStarted","Data":"675545367c7e69d094ae709ae87c3513623eaa58de462ca6e2d65e6fcbfaa832"} Oct 02 20:20:11 crc kubenswrapper[4909]: I1002 20:20:11.690042 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerStarted","Data":"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f"} Oct 02 20:20:12 crc kubenswrapper[4909]: I1002 20:20:12.608564 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 
02 20:20:12 crc kubenswrapper[4909]: E1002 20:20:12.609090 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:20:12 crc kubenswrapper[4909]: I1002 20:20:12.704853 4909 generic.go:334] "Generic (PLEG): container finished" podID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerID="32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f" exitCode=0 Oct 02 20:20:12 crc kubenswrapper[4909]: I1002 20:20:12.704897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerDied","Data":"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f"} Oct 02 20:20:13 crc kubenswrapper[4909]: I1002 20:20:13.717159 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerStarted","Data":"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999"} Oct 02 20:20:13 crc kubenswrapper[4909]: I1002 20:20:13.745689 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bqwgf" podStartSLOduration=2.324015528 podStartE2EDuration="4.745666015s" podCreationTimestamp="2025-10-02 20:20:09 +0000 UTC" firstStartedPulling="2025-10-02 20:20:10.673429726 +0000 UTC m=+7331.860925615" lastFinishedPulling="2025-10-02 20:20:13.095080253 +0000 UTC m=+7334.282576102" observedRunningTime="2025-10-02 20:20:13.736061607 +0000 UTC m=+7334.923557496" watchObservedRunningTime="2025-10-02 20:20:13.745666015 
+0000 UTC m=+7334.933161894" Oct 02 20:20:19 crc kubenswrapper[4909]: I1002 20:20:19.639855 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:19 crc kubenswrapper[4909]: I1002 20:20:19.640434 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:19 crc kubenswrapper[4909]: I1002 20:20:19.733845 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:19 crc kubenswrapper[4909]: I1002 20:20:19.883668 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:19 crc kubenswrapper[4909]: I1002 20:20:19.985962 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:21 crc kubenswrapper[4909]: I1002 20:20:21.819975 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bqwgf" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="registry-server" containerID="cri-o://b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999" gracePeriod=2 Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.408748 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.563474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content\") pod \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.563690 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsjm\" (UniqueName: \"kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm\") pod \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.563919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities\") pod \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\" (UID: \"349b97ce-4849-41c5-93a3-9bd37fe1c6db\") " Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.564730 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities" (OuterVolumeSpecName: "utilities") pod "349b97ce-4849-41c5-93a3-9bd37fe1c6db" (UID: "349b97ce-4849-41c5-93a3-9bd37fe1c6db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.565294 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.571346 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm" (OuterVolumeSpecName: "kube-api-access-wvsjm") pod "349b97ce-4849-41c5-93a3-9bd37fe1c6db" (UID: "349b97ce-4849-41c5-93a3-9bd37fe1c6db"). InnerVolumeSpecName "kube-api-access-wvsjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.614968 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "349b97ce-4849-41c5-93a3-9bd37fe1c6db" (UID: "349b97ce-4849-41c5-93a3-9bd37fe1c6db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.668283 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsjm\" (UniqueName: \"kubernetes.io/projected/349b97ce-4849-41c5-93a3-9bd37fe1c6db-kube-api-access-wvsjm\") on node \"crc\" DevicePath \"\"" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.668334 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349b97ce-4849-41c5-93a3-9bd37fe1c6db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.833231 4909 generic.go:334] "Generic (PLEG): container finished" podID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerID="b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999" exitCode=0 Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.833283 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerDied","Data":"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999"} Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.833317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqwgf" event={"ID":"349b97ce-4849-41c5-93a3-9bd37fe1c6db","Type":"ContainerDied","Data":"675545367c7e69d094ae709ae87c3513623eaa58de462ca6e2d65e6fcbfaa832"} Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.833327 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqwgf" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.833340 4909 scope.go:117] "RemoveContainer" containerID="b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.870067 4909 scope.go:117] "RemoveContainer" containerID="32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.881809 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.897891 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bqwgf"] Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.909541 4909 scope.go:117] "RemoveContainer" containerID="6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.971375 4909 scope.go:117] "RemoveContainer" containerID="b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999" Oct 02 20:20:22 crc kubenswrapper[4909]: E1002 20:20:22.971969 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999\": container with ID starting with b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999 not found: ID does not exist" containerID="b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.972059 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999"} err="failed to get container status \"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999\": rpc error: code = NotFound desc = could not find 
container \"b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999\": container with ID starting with b1575aaa176714145d2ebdaf4d55f1d67a799a0fec2a832fced6fc76e099c999 not found: ID does not exist" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.972098 4909 scope.go:117] "RemoveContainer" containerID="32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f" Oct 02 20:20:22 crc kubenswrapper[4909]: E1002 20:20:22.972556 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f\": container with ID starting with 32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f not found: ID does not exist" containerID="32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.972618 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f"} err="failed to get container status \"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f\": rpc error: code = NotFound desc = could not find container \"32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f\": container with ID starting with 32bc4be5f3dc7e5fe73be526efd8b087f7cea684f558d11de3f363bac6f3831f not found: ID does not exist" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.972659 4909 scope.go:117] "RemoveContainer" containerID="6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4" Oct 02 20:20:22 crc kubenswrapper[4909]: E1002 20:20:22.972980 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4\": container with ID starting with 6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4 not found: ID does 
not exist" containerID="6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4" Oct 02 20:20:22 crc kubenswrapper[4909]: I1002 20:20:22.973011 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4"} err="failed to get container status \"6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4\": rpc error: code = NotFound desc = could not find container \"6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4\": container with ID starting with 6b007708e88d98204719ab79a9f983ae342745f326ca2c163b341b7f361f86d4 not found: ID does not exist" Oct 02 20:20:23 crc kubenswrapper[4909]: I1002 20:20:23.609676 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:20:23 crc kubenswrapper[4909]: E1002 20:20:23.610243 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:20:23 crc kubenswrapper[4909]: I1002 20:20:23.631162 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" path="/var/lib/kubelet/pods/349b97ce-4849-41c5-93a3-9bd37fe1c6db/volumes" Oct 02 20:20:35 crc kubenswrapper[4909]: I1002 20:20:35.609085 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:20:35 crc kubenswrapper[4909]: E1002 20:20:35.609953 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:20:46 crc kubenswrapper[4909]: I1002 20:20:46.608734 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:20:46 crc kubenswrapper[4909]: E1002 20:20:46.609753 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:21:00 crc kubenswrapper[4909]: I1002 20:21:00.610927 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:21:00 crc kubenswrapper[4909]: E1002 20:21:00.611828 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:21:11 crc kubenswrapper[4909]: I1002 20:21:11.609130 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:21:11 crc kubenswrapper[4909]: E1002 20:21:11.609980 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:21:26 crc kubenswrapper[4909]: I1002 20:21:26.609369 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:21:26 crc kubenswrapper[4909]: E1002 20:21:26.610432 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:21:39 crc kubenswrapper[4909]: I1002 20:21:39.624051 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:21:39 crc kubenswrapper[4909]: E1002 20:21:39.624904 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:21:51 crc kubenswrapper[4909]: I1002 20:21:51.614154 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:21:51 crc kubenswrapper[4909]: E1002 20:21:51.615799 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:22:05 crc kubenswrapper[4909]: I1002 20:22:05.609259 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:22:05 crc kubenswrapper[4909]: E1002 20:22:05.610288 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:22:19 crc kubenswrapper[4909]: I1002 20:22:19.290926 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c3787cb-4303-4e47-aa85-ba12c768c729" containerID="5baec6969c890c8f56d69560e674e791ba43e6041a6ff0b29a64417fe5585ee1" exitCode=0 Oct 02 20:22:19 crc kubenswrapper[4909]: I1002 20:22:19.290968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5c3787cb-4303-4e47-aa85-ba12c768c729","Type":"ContainerDied","Data":"5baec6969c890c8f56d69560e674e791ba43e6041a6ff0b29a64417fe5585ee1"} Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.609242 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:22:20 crc kubenswrapper[4909]: E1002 20:22:20.609938 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.725625 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.872841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.872913 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873043 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873104 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873130 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873197 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl58p\" (UniqueName: \"kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873372 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.873464 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir\") pod \"5c3787cb-4303-4e47-aa85-ba12c768c729\" (UID: \"5c3787cb-4303-4e47-aa85-ba12c768c729\") " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.877096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.878592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data" (OuterVolumeSpecName: "config-data") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.881145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p" (OuterVolumeSpecName: "kube-api-access-cl58p") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "kube-api-access-cl58p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.887009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.891648 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.933056 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.950296 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.955637 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.975572 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.975606 4909 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.978824 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.978855 4909 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c3787cb-4303-4e47-aa85-ba12c768c729-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.978866 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.978875 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc kubenswrapper[4909]: I1002 20:22:20.978883 4909 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c3787cb-4303-4e47-aa85-ba12c768c729-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:20 crc 
kubenswrapper[4909]: I1002 20:22:20.978891 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl58p\" (UniqueName: \"kubernetes.io/projected/5c3787cb-4303-4e47-aa85-ba12c768c729-kube-api-access-cl58p\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.016170 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.028937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5c3787cb-4303-4e47-aa85-ba12c768c729" (UID: "5c3787cb-4303-4e47-aa85-ba12c768c729"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.081326 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c3787cb-4303-4e47-aa85-ba12c768c729-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.081352 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.327827 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5c3787cb-4303-4e47-aa85-ba12c768c729","Type":"ContainerDied","Data":"25547879b4fe41330559432409a5e1281c55f8b3635b6b46db0f49691db87512"} Oct 02 20:22:21 crc kubenswrapper[4909]: I1002 20:22:21.327874 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25547879b4fe41330559432409a5e1281c55f8b3635b6b46db0f49691db87512" Oct 02 20:22:21 crc 
kubenswrapper[4909]: I1002 20:22:21.327917 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.012461 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 20:22:32 crc kubenswrapper[4909]: E1002 20:22:32.013817 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="registry-server" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.013844 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="registry-server" Oct 02 20:22:32 crc kubenswrapper[4909]: E1002 20:22:32.013889 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3787cb-4303-4e47-aa85-ba12c768c729" containerName="tempest-tests-tempest-tests-runner" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.013907 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3787cb-4303-4e47-aa85-ba12c768c729" containerName="tempest-tests-tempest-tests-runner" Oct 02 20:22:32 crc kubenswrapper[4909]: E1002 20:22:32.013933 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="extract-content" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.013947 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="extract-content" Oct 02 20:22:32 crc kubenswrapper[4909]: E1002 20:22:32.013993 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="extract-utilities" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.014008 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="extract-utilities" Oct 02 20:22:32 crc 
kubenswrapper[4909]: I1002 20:22:32.014426 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="349b97ce-4849-41c5-93a3-9bd37fe1c6db" containerName="registry-server" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.014450 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3787cb-4303-4e47-aa85-ba12c768c729" containerName="tempest-tests-tempest-tests-runner" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.015776 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.027375 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7v74j" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.030599 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.141118 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.141187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648br\" (UniqueName: \"kubernetes.io/projected/69cde6e6-c884-4a30-a68b-a584dc3c370d-kube-api-access-648br\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.243120 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.243223 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-648br\" (UniqueName: \"kubernetes.io/projected/69cde6e6-c884-4a30-a68b-a584dc3c370d-kube-api-access-648br\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.247173 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.273124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-648br\" (UniqueName: \"kubernetes.io/projected/69cde6e6-c884-4a30-a68b-a584dc3c370d-kube-api-access-648br\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.298751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69cde6e6-c884-4a30-a68b-a584dc3c370d\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.355300 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 20:22:32 crc kubenswrapper[4909]: I1002 20:22:32.960119 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 20:22:32 crc kubenswrapper[4909]: W1002 20:22:32.970200 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cde6e6_c884_4a30_a68b_a584dc3c370d.slice/crio-5e5492b7b0916f27d70a6af84efc03aea67d16daa535da5ad1e8588b249039fd WatchSource:0}: Error finding container 5e5492b7b0916f27d70a6af84efc03aea67d16daa535da5ad1e8588b249039fd: Status 404 returned error can't find the container with id 5e5492b7b0916f27d70a6af84efc03aea67d16daa535da5ad1e8588b249039fd Oct 02 20:22:33 crc kubenswrapper[4909]: I1002 20:22:33.502824 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"69cde6e6-c884-4a30-a68b-a584dc3c370d","Type":"ContainerStarted","Data":"5e5492b7b0916f27d70a6af84efc03aea67d16daa535da5ad1e8588b249039fd"} Oct 02 20:22:35 crc kubenswrapper[4909]: I1002 20:22:35.534787 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"69cde6e6-c884-4a30-a68b-a584dc3c370d","Type":"ContainerStarted","Data":"1394543c85a2211a8e8ba883e56ef0f2bce8d39c1f1a507fb69bc30f8df37456"} Oct 02 20:22:35 crc kubenswrapper[4909]: I1002 20:22:35.564693 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.233079363 podStartE2EDuration="4.564669738s" podCreationTimestamp="2025-10-02 20:22:31 
+0000 UTC" firstStartedPulling="2025-10-02 20:22:32.972850475 +0000 UTC m=+7474.160346344" lastFinishedPulling="2025-10-02 20:22:34.30444085 +0000 UTC m=+7475.491936719" observedRunningTime="2025-10-02 20:22:35.552231013 +0000 UTC m=+7476.739726882" watchObservedRunningTime="2025-10-02 20:22:35.564669738 +0000 UTC m=+7476.752165627" Oct 02 20:22:35 crc kubenswrapper[4909]: I1002 20:22:35.608493 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:22:36 crc kubenswrapper[4909]: I1002 20:22:36.561577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1"} Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.166763 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxh5m/must-gather-pndst"] Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.169102 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.171021 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rxh5m"/"default-dockercfg-p42rc" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.171115 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rxh5m"/"openshift-service-ca.crt" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.171452 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rxh5m"/"kube-root-ca.crt" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.177926 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxh5m/must-gather-pndst"] Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.275388 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.276171 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mln5s\" (UniqueName: \"kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.378233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " 
pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.378292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mln5s\" (UniqueName: \"kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.378676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.406139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mln5s\" (UniqueName: \"kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s\") pod \"must-gather-pndst\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:57 crc kubenswrapper[4909]: I1002 20:22:57.546294 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:22:58 crc kubenswrapper[4909]: W1002 20:22:58.021974 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bbce0e4_7c78_4f5c_a4b3_17fa147db670.slice/crio-26b84d6fa76990f09b1675a95aba51daf023ee5778c40204bf9f73363e8468c5 WatchSource:0}: Error finding container 26b84d6fa76990f09b1675a95aba51daf023ee5778c40204bf9f73363e8468c5: Status 404 returned error can't find the container with id 26b84d6fa76990f09b1675a95aba51daf023ee5778c40204bf9f73363e8468c5 Oct 02 20:22:58 crc kubenswrapper[4909]: I1002 20:22:58.024549 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxh5m/must-gather-pndst"] Oct 02 20:22:58 crc kubenswrapper[4909]: I1002 20:22:58.025897 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:22:58 crc kubenswrapper[4909]: I1002 20:22:58.858822 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/must-gather-pndst" event={"ID":"6bbce0e4-7c78-4f5c-a4b3-17fa147db670","Type":"ContainerStarted","Data":"26b84d6fa76990f09b1675a95aba51daf023ee5778c40204bf9f73363e8468c5"} Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.484455 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.490781 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.498012 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.592120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.592449 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.592693 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgmc\" (UniqueName: \"kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.695644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxgmc\" (UniqueName: \"kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.695999 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.696167 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.696374 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.696623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.713848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxgmc\" (UniqueName: \"kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc\") pod \"redhat-operators-px8vp\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:01 crc kubenswrapper[4909]: I1002 20:23:01.817036 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:02 crc kubenswrapper[4909]: I1002 20:23:02.966535 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:02 crc kubenswrapper[4909]: W1002 20:23:02.988160 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecb963fa_bb63_4c76_9047_9ccff1adc8ed.slice/crio-092f6781877df5010f350c2f0f0488385a3febc1b925bbeac10d1d9782ca073a WatchSource:0}: Error finding container 092f6781877df5010f350c2f0f0488385a3febc1b925bbeac10d1d9782ca073a: Status 404 returned error can't find the container with id 092f6781877df5010f350c2f0f0488385a3febc1b925bbeac10d1d9782ca073a Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.932088 4909 generic.go:334] "Generic (PLEG): container finished" podID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerID="4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd" exitCode=0 Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.932199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerDied","Data":"4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd"} Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.932738 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerStarted","Data":"092f6781877df5010f350c2f0f0488385a3febc1b925bbeac10d1d9782ca073a"} Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.935992 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/must-gather-pndst" 
event={"ID":"6bbce0e4-7c78-4f5c-a4b3-17fa147db670","Type":"ContainerStarted","Data":"d482af58ab3304b42829015ef141ce0d16d6182d72ad2e24852414aba5ee642b"} Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.936462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/must-gather-pndst" event={"ID":"6bbce0e4-7c78-4f5c-a4b3-17fa147db670","Type":"ContainerStarted","Data":"64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6"} Oct 02 20:23:03 crc kubenswrapper[4909]: I1002 20:23:03.981390 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rxh5m/must-gather-pndst" podStartSLOduration=2.502536224 podStartE2EDuration="6.981369839s" podCreationTimestamp="2025-10-02 20:22:57 +0000 UTC" firstStartedPulling="2025-10-02 20:22:58.025694304 +0000 UTC m=+7499.213190163" lastFinishedPulling="2025-10-02 20:23:02.504527919 +0000 UTC m=+7503.692023778" observedRunningTime="2025-10-02 20:23:03.981170633 +0000 UTC m=+7505.168666502" watchObservedRunningTime="2025-10-02 20:23:03.981369839 +0000 UTC m=+7505.168865718" Oct 02 20:23:05 crc kubenswrapper[4909]: I1002 20:23:05.965461 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerStarted","Data":"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825"} Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.034948 4909 generic.go:334] "Generic (PLEG): container finished" podID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerID="ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825" exitCode=0 Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.035542 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" 
event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerDied","Data":"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825"} Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.917041 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-8xdnp"] Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.918951 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.985228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh99h\" (UniqueName: \"kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:09 crc kubenswrapper[4909]: I1002 20:23:09.985287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.047880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerStarted","Data":"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe"} Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.075114 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-px8vp" podStartSLOduration=3.306129557 podStartE2EDuration="9.075090196s" podCreationTimestamp="2025-10-02 20:23:01 +0000 UTC" firstStartedPulling="2025-10-02 20:23:03.935752237 +0000 UTC 
m=+7505.123248126" lastFinishedPulling="2025-10-02 20:23:09.704712896 +0000 UTC m=+7510.892208765" observedRunningTime="2025-10-02 20:23:10.069285677 +0000 UTC m=+7511.256781536" watchObservedRunningTime="2025-10-02 20:23:10.075090196 +0000 UTC m=+7511.262586055" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.087319 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh99h\" (UniqueName: \"kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.087370 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.088281 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.112798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh99h\" (UniqueName: \"kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h\") pod \"crc-debug-8xdnp\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:10 crc kubenswrapper[4909]: I1002 20:23:10.272125 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:23:11 crc kubenswrapper[4909]: I1002 20:23:11.058093 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" event={"ID":"e9bb2caf-009e-4be2-9691-7780a963b552","Type":"ContainerStarted","Data":"c80d3728e0aba1911d6f514978c226b11fcbb8b5f27247db75004889a88eb6ab"} Oct 02 20:23:11 crc kubenswrapper[4909]: I1002 20:23:11.817225 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:11 crc kubenswrapper[4909]: I1002 20:23:11.817279 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:12 crc kubenswrapper[4909]: I1002 20:23:12.879918 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-px8vp" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" probeResult="failure" output=< Oct 02 20:23:12 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:23:12 crc kubenswrapper[4909]: > Oct 02 20:23:22 crc kubenswrapper[4909]: I1002 20:23:22.183637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" event={"ID":"e9bb2caf-009e-4be2-9691-7780a963b552","Type":"ContainerStarted","Data":"074ebb343af9089d2a8809c8487ec87bede8576b785402ef1f31fb24bbe8529f"} Oct 02 20:23:22 crc kubenswrapper[4909]: I1002 20:23:22.206195 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" podStartSLOduration=1.7249404149999998 podStartE2EDuration="13.206176777s" podCreationTimestamp="2025-10-02 20:23:09 +0000 UTC" firstStartedPulling="2025-10-02 20:23:10.331772779 +0000 UTC m=+7511.519268638" lastFinishedPulling="2025-10-02 20:23:21.813009141 +0000 UTC m=+7523.000505000" 
observedRunningTime="2025-10-02 20:23:22.203440293 +0000 UTC m=+7523.390936152" watchObservedRunningTime="2025-10-02 20:23:22.206176777 +0000 UTC m=+7523.393672636" Oct 02 20:23:22 crc kubenswrapper[4909]: I1002 20:23:22.872064 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-px8vp" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" probeResult="failure" output=< Oct 02 20:23:22 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:23:22 crc kubenswrapper[4909]: > Oct 02 20:23:32 crc kubenswrapper[4909]: I1002 20:23:32.868750 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-px8vp" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" probeResult="failure" output=< Oct 02 20:23:32 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:23:32 crc kubenswrapper[4909]: > Oct 02 20:23:41 crc kubenswrapper[4909]: I1002 20:23:41.911852 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:42 crc kubenswrapper[4909]: I1002 20:23:42.000951 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:42 crc kubenswrapper[4909]: I1002 20:23:42.158407 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:43 crc kubenswrapper[4909]: I1002 20:23:43.426806 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-px8vp" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" containerID="cri-o://717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe" gracePeriod=2 Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.206374 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.327704 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities\") pod \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.327840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxgmc\" (UniqueName: \"kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc\") pod \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.327877 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content\") pod \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\" (UID: \"ecb963fa-bb63-4c76-9047-9ccff1adc8ed\") " Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.329286 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities" (OuterVolumeSpecName: "utilities") pod "ecb963fa-bb63-4c76-9047-9ccff1adc8ed" (UID: "ecb963fa-bb63-4c76-9047-9ccff1adc8ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.350912 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc" (OuterVolumeSpecName: "kube-api-access-mxgmc") pod "ecb963fa-bb63-4c76-9047-9ccff1adc8ed" (UID: "ecb963fa-bb63-4c76-9047-9ccff1adc8ed"). 
InnerVolumeSpecName "kube-api-access-mxgmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.427315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecb963fa-bb63-4c76-9047-9ccff1adc8ed" (UID: "ecb963fa-bb63-4c76-9047-9ccff1adc8ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.431101 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.431318 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxgmc\" (UniqueName: \"kubernetes.io/projected/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-kube-api-access-mxgmc\") on node \"crc\" DevicePath \"\"" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.431427 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb963fa-bb63-4c76-9047-9ccff1adc8ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.446934 4909 generic.go:334] "Generic (PLEG): container finished" podID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerID="717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe" exitCode=0 Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.446974 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerDied","Data":"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe"} Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.446998 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px8vp" event={"ID":"ecb963fa-bb63-4c76-9047-9ccff1adc8ed","Type":"ContainerDied","Data":"092f6781877df5010f350c2f0f0488385a3febc1b925bbeac10d1d9782ca073a"} Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.447016 4909 scope.go:117] "RemoveContainer" containerID="717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.447131 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-px8vp" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.494398 4909 scope.go:117] "RemoveContainer" containerID="ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.498563 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.508283 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-px8vp"] Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.526375 4909 scope.go:117] "RemoveContainer" containerID="4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.590177 4909 scope.go:117] "RemoveContainer" containerID="717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe" Oct 02 20:23:44 crc kubenswrapper[4909]: E1002 20:23:44.592199 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe\": container with ID starting with 717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe not found: ID does not exist" containerID="717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe" Oct 02 20:23:44 crc 
kubenswrapper[4909]: I1002 20:23:44.592249 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe"} err="failed to get container status \"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe\": rpc error: code = NotFound desc = could not find container \"717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe\": container with ID starting with 717c01f903946bbd37ccb912f4d799d07e0d32b7f6073f2dc0b22cb1749b2efe not found: ID does not exist" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.592276 4909 scope.go:117] "RemoveContainer" containerID="ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825" Oct 02 20:23:44 crc kubenswrapper[4909]: E1002 20:23:44.592795 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825\": container with ID starting with ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825 not found: ID does not exist" containerID="ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.592853 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825"} err="failed to get container status \"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825\": rpc error: code = NotFound desc = could not find container \"ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825\": container with ID starting with ce6bff5c56541974eb3dae826ebde698a9c9f80140dc78970235c674bb31d825 not found: ID does not exist" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.592890 4909 scope.go:117] "RemoveContainer" containerID="4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd" Oct 02 
20:23:44 crc kubenswrapper[4909]: E1002 20:23:44.593389 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd\": container with ID starting with 4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd not found: ID does not exist" containerID="4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd" Oct 02 20:23:44 crc kubenswrapper[4909]: I1002 20:23:44.593422 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd"} err="failed to get container status \"4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd\": rpc error: code = NotFound desc = could not find container \"4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd\": container with ID starting with 4d9b9e834239226f80953c00f26e8989e215bdfc5a35df3ca06a25ef41599dfd not found: ID does not exist" Oct 02 20:23:45 crc kubenswrapper[4909]: I1002 20:23:45.620955 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" path="/var/lib/kubelet/pods/ecb963fa-bb63-4c76-9047-9ccff1adc8ed/volumes" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.312515 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:08 crc kubenswrapper[4909]: E1002 20:24:08.326675 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.326720 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" Oct 02 20:24:08 crc kubenswrapper[4909]: E1002 20:24:08.326762 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="extract-content" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.326769 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="extract-content" Oct 02 20:24:08 crc kubenswrapper[4909]: E1002 20:24:08.326984 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="extract-utilities" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.326996 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="extract-utilities" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.327691 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb963fa-bb63-4c76-9047-9ccff1adc8ed" containerName="registry-server" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.329953 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.343263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.475521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhnnl\" (UniqueName: \"kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.475582 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content\") pod \"certified-operators-d8kgf\" (UID: 
\"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.475616 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.577352 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhnnl\" (UniqueName: \"kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.577417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.577467 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.578397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities\") pod \"certified-operators-d8kgf\" (UID: 
\"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.578628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.606204 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhnnl\" (UniqueName: \"kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl\") pod \"certified-operators-d8kgf\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:08 crc kubenswrapper[4909]: I1002 20:24:08.653762 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:11 crc kubenswrapper[4909]: I1002 20:24:11.045077 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:11 crc kubenswrapper[4909]: I1002 20:24:11.851638 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerID="7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081" exitCode=0 Oct 02 20:24:11 crc kubenswrapper[4909]: I1002 20:24:11.851723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerDied","Data":"7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081"} Oct 02 20:24:11 crc kubenswrapper[4909]: I1002 20:24:11.851948 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerStarted","Data":"829aac59571f5d37adcdf88ad75b28652cb4df64e0118b529601b6b172ca02f6"} Oct 02 20:24:12 crc kubenswrapper[4909]: I1002 20:24:12.877294 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerStarted","Data":"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326"} Oct 02 20:24:14 crc kubenswrapper[4909]: I1002 20:24:14.900741 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerID="d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326" exitCode=0 Oct 02 20:24:14 crc kubenswrapper[4909]: I1002 20:24:14.900819 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" 
event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerDied","Data":"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326"} Oct 02 20:24:15 crc kubenswrapper[4909]: I1002 20:24:15.916222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerStarted","Data":"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd"} Oct 02 20:24:15 crc kubenswrapper[4909]: I1002 20:24:15.945152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d8kgf" podStartSLOduration=4.464069663 podStartE2EDuration="7.945133083s" podCreationTimestamp="2025-10-02 20:24:08 +0000 UTC" firstStartedPulling="2025-10-02 20:24:11.853868241 +0000 UTC m=+7573.041364100" lastFinishedPulling="2025-10-02 20:24:15.334931661 +0000 UTC m=+7576.522427520" observedRunningTime="2025-10-02 20:24:15.934958269 +0000 UTC m=+7577.122454138" watchObservedRunningTime="2025-10-02 20:24:15.945133083 +0000 UTC m=+7577.132628942" Oct 02 20:24:18 crc kubenswrapper[4909]: I1002 20:24:18.654830 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:18 crc kubenswrapper[4909]: I1002 20:24:18.655562 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:19 crc kubenswrapper[4909]: I1002 20:24:19.702241 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d8kgf" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="registry-server" probeResult="failure" output=< Oct 02 20:24:19 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:24:19 crc kubenswrapper[4909]: > Oct 02 20:24:28 crc kubenswrapper[4909]: I1002 20:24:28.710609 4909 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:28 crc kubenswrapper[4909]: I1002 20:24:28.764883 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:28 crc kubenswrapper[4909]: I1002 20:24:28.952441 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:30 crc kubenswrapper[4909]: I1002 20:24:30.075149 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d8kgf" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="registry-server" containerID="cri-o://6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd" gracePeriod=2 Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.668281 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.792199 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities\") pod \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.792628 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhnnl\" (UniqueName: \"kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl\") pod \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.792776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content\") pod \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\" (UID: \"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b\") " Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.793501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities" (OuterVolumeSpecName: "utilities") pod "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" (UID: "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.798593 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl" (OuterVolumeSpecName: "kube-api-access-bhnnl") pod "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" (UID: "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b"). InnerVolumeSpecName "kube-api-access-bhnnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.873200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" (UID: "4a7ba3d6-d7d8-440e-9166-3e2e484ac38b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.896186 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.896220 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhnnl\" (UniqueName: \"kubernetes.io/projected/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-kube-api-access-bhnnl\") on node \"crc\" DevicePath \"\"" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:30.896238 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.091023 4909 generic.go:334] "Generic (PLEG): container finished" podID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerID="6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd" exitCode=0 Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.091094 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerDied","Data":"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd"} Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.091127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8kgf" event={"ID":"4a7ba3d6-d7d8-440e-9166-3e2e484ac38b","Type":"ContainerDied","Data":"829aac59571f5d37adcdf88ad75b28652cb4df64e0118b529601b6b172ca02f6"} Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.091148 4909 scope.go:117] "RemoveContainer" containerID="6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 
20:24:31.091290 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8kgf" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.148929 4909 scope.go:117] "RemoveContainer" containerID="d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.152801 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.192827 4909 scope.go:117] "RemoveContainer" containerID="7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.205070 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d8kgf"] Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.253934 4909 scope.go:117] "RemoveContainer" containerID="6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd" Oct 02 20:24:31 crc kubenswrapper[4909]: E1002 20:24:31.254302 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd\": container with ID starting with 6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd not found: ID does not exist" containerID="6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.254328 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd"} err="failed to get container status \"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd\": rpc error: code = NotFound desc = could not find container \"6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd\": container with ID starting with 
6df5889ff5dc4cf9f1b6bc7c57cd143cac09c1c3ffc8dfc69a6358c3ad1772fd not found: ID does not exist" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.254349 4909 scope.go:117] "RemoveContainer" containerID="d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326" Oct 02 20:24:31 crc kubenswrapper[4909]: E1002 20:24:31.254862 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326\": container with ID starting with d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326 not found: ID does not exist" containerID="d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.254889 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326"} err="failed to get container status \"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326\": rpc error: code = NotFound desc = could not find container \"d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326\": container with ID starting with d8ea746145a383e586677f6f394cf2d551b6c1d1714f0f3276438900768a6326 not found: ID does not exist" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.254903 4909 scope.go:117] "RemoveContainer" containerID="7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081" Oct 02 20:24:31 crc kubenswrapper[4909]: E1002 20:24:31.255424 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081\": container with ID starting with 7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081 not found: ID does not exist" containerID="7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081" Oct 02 20:24:31 crc 
kubenswrapper[4909]: I1002 20:24:31.255472 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081"} err="failed to get container status \"7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081\": rpc error: code = NotFound desc = could not find container \"7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081\": container with ID starting with 7a33a1610d60570a1682695d9a70ce6938b89e5d2d971de6e8675d551d7e4081 not found: ID does not exist" Oct 02 20:24:31 crc kubenswrapper[4909]: I1002 20:24:31.620512 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" path="/var/lib/kubelet/pods/4a7ba3d6-d7d8-440e-9166-3e2e484ac38b/volumes" Oct 02 20:24:40 crc kubenswrapper[4909]: I1002 20:24:40.695147 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-api/0.log" Oct 02 20:24:40 crc kubenswrapper[4909]: I1002 20:24:40.897360 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-evaluator/0.log" Oct 02 20:24:40 crc kubenswrapper[4909]: I1002 20:24:40.932647 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-listener/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.081725 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-notifier/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.294988 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b6849dc4b-wmr8p_b4918d43-b740-469f-9ee9-3011c688b622/barbican-api/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.351401 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6b6849dc4b-wmr8p_b4918d43-b740-469f-9ee9-3011c688b622/barbican-api-log/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.544669 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75889c558-g9b2x_efde1707-49d8-4674-9cf8-3cd1de63b5c9/barbican-keystone-listener/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.849685 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75889c558-g9b2x_efde1707-49d8-4674-9cf8-3cd1de63b5c9/barbican-keystone-listener-log/0.log" Oct 02 20:24:41 crc kubenswrapper[4909]: I1002 20:24:41.987508 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77d5f8945f-mrwjk_0709099c-cc29-4dbb-9874-ab921318a936/barbican-worker/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.057313 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77d5f8945f-mrwjk_0709099c-cc29-4dbb-9874-ab921318a936/barbican-worker-log/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.269611 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2_50be6a2c-c436-47ba-a5c7-d2cb51151c09/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.523345 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/ceilometer-central-agent/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.671162 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/ceilometer-notification-agent/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.774788 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/proxy-httpd/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.846406 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/sg-core/0.log" Oct 02 20:24:42 crc kubenswrapper[4909]: I1002 20:24:42.948593 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76_9dae9f65-075e-408c-851a-61e7b36a99f7/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.131017 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg_e78721da-178f-402b-a402-3709a89b744e/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.386661 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_237e5a0a-a767-48de-aa6b-bb8cd24fd570/cinder-api-log/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.452241 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_237e5a0a-a767-48de-aa6b-bb8cd24fd570/cinder-api/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.701585 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6d132d6b-b137-48eb-8ccd-39354ec83056/probe/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.751507 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6d132d6b-b137-48eb-8ccd-39354ec83056/cinder-backup/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.915845 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d04288ee-974d-451f-8144-e981667f5115/cinder-scheduler/0.log" Oct 02 20:24:43 crc kubenswrapper[4909]: I1002 20:24:43.945660 4909 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d04288ee-974d-451f-8144-e981667f5115/probe/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.125273 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2d392c07-e417-4d0d-b301-adf410105519/cinder-volume/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.149842 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2d392c07-e417-4d0d-b301-adf410105519/probe/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.368967 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9bw67_1ee1ee0b-fca9-4cc7-8016-e8a623cb5955/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.497126 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf_7d34a713-2f9f-4d3d-9341-eb2419b2057d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.584384 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/init/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.741441 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/init/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.795861 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/dnsmasq-dns/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.930044 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_d359125d-3d0d-4c5d-bacd-c722f9fe3116/glance-httpd/0.log" Oct 02 20:24:44 crc kubenswrapper[4909]: I1002 20:24:44.976922 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d359125d-3d0d-4c5d-bacd-c722f9fe3116/glance-log/0.log" Oct 02 20:24:45 crc kubenswrapper[4909]: I1002 20:24:45.131550 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d71e63f4-8904-4721-8c48-b66216330fc2/glance-httpd/0.log" Oct 02 20:24:45 crc kubenswrapper[4909]: I1002 20:24:45.158283 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d71e63f4-8904-4721-8c48-b66216330fc2/glance-log/0.log" Oct 02 20:24:45 crc kubenswrapper[4909]: I1002 20:24:45.939005 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6fd97cfc64-gbtcw_b0981006-350f-4a53-85d4-35a0bb5c3eca/heat-engine/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 20:24:46.313287 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-85bddbb496-tlzmz_8b922021-6783-4a96-8ee1-a571074d1f49/heat-api/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 20:24:46.464515 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c7d49444-5plj4_a982efe8-dc2e-4706-bfd3-0b14dbd266cf/horizon/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 20:24:46.467134 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6cc7795987-nsplf_9220d7fd-14fd-44e5-ba47-2b4038b7472f/heat-cfnapi/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 20:24:46.689850 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp_db8ea573-a0c2-4c11-9f5b-4122524f6384/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 
20:24:46.790609 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c7d49444-5plj4_a982efe8-dc2e-4706-bfd3-0b14dbd266cf/horizon-log/0.log" Oct 02 20:24:46 crc kubenswrapper[4909]: I1002 20:24:46.887861 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c2klm_a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.205526 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323861-qs6x8_8c8428d5-6fb6-4e18-84b2-d5f88dd993d7/keystone-cron/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.401059 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323921-smc4d_d5fd446f-b5eb-4efa-ae6f-877c30b6321d/keystone-cron/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.539056 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8486f6d788-jp6q4_eeeaf88e-e118-47c2-84b3-088443528f41/keystone-api/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.575437 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ccc2628c-e957-4344-b396-d3fe3ddd0da1/kube-state-metrics/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.744517 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz_e917646f-ee0c-442e-8a71-d637ef36a45e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:47 crc kubenswrapper[4909]: I1002 20:24:47.912160 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-5cjpl_524a4d41-71b9-4a01-8b0a-37c2748a79a2/logging-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.056835 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_ba1680d2-9b8d-4c73-bf55-58ad455baa81/manila-api-log/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.186422 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ba1680d2-9b8d-4c73-bf55-58ad455baa81/manila-api/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.352257 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1364e9a8-caf2-48b7-bf59-5405d8c7ce96/manila-scheduler/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.414941 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1364e9a8-caf2-48b7-bf59-5405d8c7ce96/probe/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.588150 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_16f4bb3d-8601-40aa-bef4-026dc559b7a9/manila-share/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.592710 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_16f4bb3d-8601-40aa-bef4-026dc559b7a9/probe/0.log" Oct 02 20:24:48 crc kubenswrapper[4909]: I1002 20:24:48.878286 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_ed14f597-80d1-41ec-a205-85e2e85173c2/mysqld-exporter/0.log" Oct 02 20:24:49 crc kubenswrapper[4909]: I1002 20:24:49.290191 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-849fddf4f5-f44kx_84148aec-ab89-4621-8448-b1bbbf294ad5/neutron-api/0.log" Oct 02 20:24:49 crc kubenswrapper[4909]: I1002 20:24:49.401562 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-849fddf4f5-f44kx_84148aec-ab89-4621-8448-b1bbbf294ad5/neutron-httpd/0.log" Oct 02 20:24:49 crc kubenswrapper[4909]: I1002 20:24:49.615202 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq_c5ea6385-4b7d-43d2-9666-f87c8aeab5e7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:50 crc kubenswrapper[4909]: I1002 20:24:50.407785 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_080b267d-f77b-4b0e-ab73-326a3c9b67b9/nova-api-log/0.log" Oct 02 20:24:50 crc kubenswrapper[4909]: I1002 20:24:50.822769 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_080b267d-f77b-4b0e-ab73-326a3c9b67b9/nova-api-api/0.log" Oct 02 20:24:50 crc kubenswrapper[4909]: I1002 20:24:50.950383 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3a482be6-ee1c-4b4b-a1fe-e05813afe8c1/nova-cell0-conductor-conductor/0.log" Oct 02 20:24:51 crc kubenswrapper[4909]: I1002 20:24:51.276099 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4e10ad5d-5b1e-47a3-a188-50702d492942/nova-cell1-conductor-conductor/0.log" Oct 02 20:24:51 crc kubenswrapper[4909]: I1002 20:24:51.563099 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fe17b6af-3b5c-45b6-b6d3-140b3873c81d/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 20:24:51 crc kubenswrapper[4909]: I1002 20:24:51.819905 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t_d535ce44-b440-434f-8526-3dd777d90ae8/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:52 crc kubenswrapper[4909]: I1002 20:24:52.055665 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_55be370d-35f9-4114-9fd5-48a0b939125a/nova-metadata-log/0.log" Oct 02 20:24:52 crc kubenswrapper[4909]: I1002 20:24:52.657960 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_928ab855-d71e-48e2-bbae-e872154de8bf/nova-scheduler-scheduler/0.log" Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.054693 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.056797 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.111690 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/mysql-bootstrap/0.log" Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.319003 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/mysql-bootstrap/0.log" Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.515738 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/galera/0.log" Oct 02 20:24:53 crc kubenswrapper[4909]: I1002 20:24:53.975279 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/mysql-bootstrap/0.log" Oct 02 20:24:54 crc kubenswrapper[4909]: I1002 20:24:54.154182 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/mysql-bootstrap/0.log" Oct 02 20:24:54 crc kubenswrapper[4909]: I1002 
20:24:54.368463 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/galera/0.log" Oct 02 20:24:54 crc kubenswrapper[4909]: I1002 20:24:54.754616 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a1aff9fe-ad1a-4056-a64c-8b83abf09d32/openstackclient/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.049249 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_55be370d-35f9-4114-9fd5-48a0b939125a/nova-metadata-metadata/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.164190 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bzwss_b3890e7f-68df-442e-be6c-d6c69309c66d/openstack-network-exporter/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.419533 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server-init/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.595597 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server-init/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.656329 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovs-vswitchd/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.772818 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server/0.log" Oct 02 20:24:55 crc kubenswrapper[4909]: I1002 20:24:55.962430 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sj5pf_9f1ef01b-ae9b-4a58-a411-d2e2e5770742/ovn-controller/0.log" Oct 02 20:24:56 crc kubenswrapper[4909]: I1002 20:24:56.166111 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-s99c9_5aaf1b55-e573-4c4a-b68a-7d9477b5393d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:24:56 crc kubenswrapper[4909]: I1002 20:24:56.580533 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5db23d92-24c2-4ee0-a489-adbb1c9cc04e/openstack-network-exporter/0.log" Oct 02 20:24:56 crc kubenswrapper[4909]: I1002 20:24:56.776472 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5db23d92-24c2-4ee0-a489-adbb1c9cc04e/ovn-northd/0.log" Oct 02 20:24:56 crc kubenswrapper[4909]: I1002 20:24:56.995667 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e32a55a-2e45-4365-a8cf-3002a0c0ba73/openstack-network-exporter/0.log" Oct 02 20:24:57 crc kubenswrapper[4909]: I1002 20:24:57.103049 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e32a55a-2e45-4365-a8cf-3002a0c0ba73/ovsdbserver-nb/0.log" Oct 02 20:24:57 crc kubenswrapper[4909]: I1002 20:24:57.274597 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17ca7d1c-52c8-480e-975a-d22877f0971f/openstack-network-exporter/0.log" Oct 02 20:24:57 crc kubenswrapper[4909]: I1002 20:24:57.456396 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17ca7d1c-52c8-480e-975a-d22877f0971f/ovsdbserver-sb/0.log" Oct 02 20:24:57 crc kubenswrapper[4909]: I1002 20:24:57.780641 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57bc775dcb-fd69b_af49e65e-4d40-4b78-8219-aa5d209825d0/placement-api/0.log" Oct 02 20:24:57 crc kubenswrapper[4909]: I1002 20:24:57.966211 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57bc775dcb-fd69b_af49e65e-4d40-4b78-8219-aa5d209825d0/placement-log/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 
20:24:58.154753 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/init-config-reloader/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 20:24:58.389379 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/init-config-reloader/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 20:24:58.419679 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/config-reloader/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 20:24:58.599514 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/prometheus/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 20:24:58.652758 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/thanos-sidecar/0.log" Oct 02 20:24:58 crc kubenswrapper[4909]: I1002 20:24:58.889108 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/setup-container/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 20:24:59.056262 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/setup-container/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 20:24:59.123276 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/rabbitmq/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 20:24:59.344576 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/setup-container/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 
20:24:59.656639 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/setup-container/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 20:24:59.672495 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/rabbitmq/0.log" Oct 02 20:24:59 crc kubenswrapper[4909]: I1002 20:24:59.872931 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6_dc36b692-f1b5-4320-9dd0-dbc68b7b5db0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:00 crc kubenswrapper[4909]: I1002 20:25:00.175957 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm_c1372353-9e7d-4e84-b8b9-44db5e82f9d1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:00 crc kubenswrapper[4909]: I1002 20:25:00.356317 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9m7r7_3b67f438-661f-476b-867c-8d1f4e742634/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:00 crc kubenswrapper[4909]: I1002 20:25:00.591531 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qxxtj_50a84617-a1af-4b99-a42d-d72651c450dd/ssh-known-hosts-edpm-deployment/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.050255 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc88f55df-4dlwb_811ebdce-cef0-4178-836b-17bcdd164575/proxy-server/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.222272 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc88f55df-4dlwb_811ebdce-cef0-4178-836b-17bcdd164575/proxy-httpd/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.419211 4909 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4kkkt_0e9cab60-e149-4118-a3ca-4423620830bf/swift-ring-rebalance/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.648168 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-auditor/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.767908 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-reaper/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.888369 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-replicator/0.log" Oct 02 20:25:01 crc kubenswrapper[4909]: I1002 20:25:01.960901 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-server/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.077721 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-auditor/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.236017 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-replicator/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.329604 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-server/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.475455 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-updater/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.592108 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-auditor/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.733184 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-expirer/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.869682 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-replicator/0.log" Oct 02 20:25:02 crc kubenswrapper[4909]: I1002 20:25:02.974438 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-server/0.log" Oct 02 20:25:03 crc kubenswrapper[4909]: I1002 20:25:03.083658 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-updater/0.log" Oct 02 20:25:03 crc kubenswrapper[4909]: I1002 20:25:03.273760 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/swift-recon-cron/0.log" Oct 02 20:25:03 crc kubenswrapper[4909]: I1002 20:25:03.302972 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/rsync/0.log" Oct 02 20:25:03 crc kubenswrapper[4909]: I1002 20:25:03.580514 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4k427_fc3aeeca-599a-4f61-92d4-9a09ad65206f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:03 crc kubenswrapper[4909]: I1002 20:25:03.694879 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24_e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:04 crc kubenswrapper[4909]: 
I1002 20:25:04.074572 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_69cde6e6-c884-4a30-a68b-a584dc3c370d/test-operator-logs-container/0.log" Oct 02 20:25:04 crc kubenswrapper[4909]: I1002 20:25:04.324610 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n_9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:25:04 crc kubenswrapper[4909]: I1002 20:25:04.742981 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5c3787cb-4303-4e47-aa85-ba12c768c729/tempest-tests-tempest-tests-runner/0.log" Oct 02 20:25:07 crc kubenswrapper[4909]: I1002 20:25:07.166939 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f77fb311-1680-4c6f-ac0c-70baa3e89b81/memcached/0.log" Oct 02 20:25:23 crc kubenswrapper[4909]: I1002 20:25:23.054341 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:25:23 crc kubenswrapper[4909]: I1002 20:25:23.054784 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:25:53 crc kubenswrapper[4909]: I1002 20:25:53.054340 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:25:53 crc kubenswrapper[4909]: I1002 20:25:53.054950 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:25:53 crc kubenswrapper[4909]: I1002 20:25:53.055012 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:25:53 crc kubenswrapper[4909]: I1002 20:25:53.056327 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:25:53 crc kubenswrapper[4909]: I1002 20:25:53.056428 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1" gracePeriod=600 Oct 02 20:25:54 crc kubenswrapper[4909]: I1002 20:25:54.009083 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1" exitCode=0 Oct 02 20:25:54 crc kubenswrapper[4909]: I1002 20:25:54.009293 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1"} Oct 02 20:25:54 crc kubenswrapper[4909]: I1002 20:25:54.009436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"} Oct 02 20:25:54 crc kubenswrapper[4909]: I1002 20:25:54.009460 4909 scope.go:117] "RemoveContainer" containerID="8ccdb7e3db252490b1d5a4700f76a38cc594abd43d28552b0f69f4968a9bf3fc" Oct 02 20:26:00 crc kubenswrapper[4909]: I1002 20:26:00.086235 4909 generic.go:334] "Generic (PLEG): container finished" podID="e9bb2caf-009e-4be2-9691-7780a963b552" containerID="074ebb343af9089d2a8809c8487ec87bede8576b785402ef1f31fb24bbe8529f" exitCode=0 Oct 02 20:26:00 crc kubenswrapper[4909]: I1002 20:26:00.086334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" event={"ID":"e9bb2caf-009e-4be2-9691-7780a963b552","Type":"ContainerDied","Data":"074ebb343af9089d2a8809c8487ec87bede8576b785402ef1f31fb24bbe8529f"} Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.259368 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.305289 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-8xdnp"] Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.316600 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-8xdnp"] Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.336107 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh99h\" (UniqueName: \"kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h\") pod \"e9bb2caf-009e-4be2-9691-7780a963b552\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.336196 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host\") pod \"e9bb2caf-009e-4be2-9691-7780a963b552\" (UID: \"e9bb2caf-009e-4be2-9691-7780a963b552\") " Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.336609 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host" (OuterVolumeSpecName: "host") pod "e9bb2caf-009e-4be2-9691-7780a963b552" (UID: "e9bb2caf-009e-4be2-9691-7780a963b552"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.338594 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9bb2caf-009e-4be2-9691-7780a963b552-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.478340 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h" (OuterVolumeSpecName: "kube-api-access-zh99h") pod "e9bb2caf-009e-4be2-9691-7780a963b552" (UID: "e9bb2caf-009e-4be2-9691-7780a963b552"). InnerVolumeSpecName "kube-api-access-zh99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.546340 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh99h\" (UniqueName: \"kubernetes.io/projected/e9bb2caf-009e-4be2-9691-7780a963b552-kube-api-access-zh99h\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:01 crc kubenswrapper[4909]: I1002 20:26:01.627170 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bb2caf-009e-4be2-9691-7780a963b552" path="/var/lib/kubelet/pods/e9bb2caf-009e-4be2-9691-7780a963b552/volumes" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.118457 4909 scope.go:117] "RemoveContainer" containerID="074ebb343af9089d2a8809c8487ec87bede8576b785402ef1f31fb24bbe8529f" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.118543 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-8xdnp" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.556299 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-mpgx5"] Oct 02 20:26:02 crc kubenswrapper[4909]: E1002 20:26:02.562289 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="registry-server" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.562371 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="registry-server" Oct 02 20:26:02 crc kubenswrapper[4909]: E1002 20:26:02.562459 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="extract-utilities" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.562479 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="extract-utilities" Oct 02 20:26:02 crc kubenswrapper[4909]: E1002 20:26:02.562558 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bb2caf-009e-4be2-9691-7780a963b552" containerName="container-00" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.562577 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bb2caf-009e-4be2-9691-7780a963b552" containerName="container-00" Oct 02 20:26:02 crc kubenswrapper[4909]: E1002 20:26:02.562689 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="extract-content" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.562738 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" containerName="extract-content" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.565189 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7ba3d6-d7d8-440e-9166-3e2e484ac38b" 
containerName="registry-server" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.565260 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bb2caf-009e-4be2-9691-7780a963b552" containerName="container-00" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.570514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.679953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.680143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcc7q\" (UniqueName: \"kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.782545 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.782644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcc7q\" (UniqueName: \"kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc 
kubenswrapper[4909]: I1002 20:26:02.788129 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.825320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcc7q\" (UniqueName: \"kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q\") pod \"crc-debug-mpgx5\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:02 crc kubenswrapper[4909]: I1002 20:26:02.913493 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:03 crc kubenswrapper[4909]: I1002 20:26:03.135535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" event={"ID":"61367c5a-70a4-430d-b7c8-5b1a92284dfd","Type":"ContainerStarted","Data":"107b67ed9560060e8d4f50763c504c8d8b00740f0147b7b20dc3eda6ebb0f58c"} Oct 02 20:26:04 crc kubenswrapper[4909]: I1002 20:26:04.151561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" event={"ID":"61367c5a-70a4-430d-b7c8-5b1a92284dfd","Type":"ContainerStarted","Data":"a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f"} Oct 02 20:26:04 crc kubenswrapper[4909]: I1002 20:26:04.170993 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" podStartSLOduration=2.170965742 podStartE2EDuration="2.170965742s" podCreationTimestamp="2025-10-02 20:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
20:26:04.166669629 +0000 UTC m=+7685.354165558" watchObservedRunningTime="2025-10-02 20:26:04.170965742 +0000 UTC m=+7685.358461631" Oct 02 20:26:04 crc kubenswrapper[4909]: E1002 20:26:04.404611 4909 log.go:32] "ReopenContainerLog from runtime service failed" err="rpc error: code = Unknown desc = container is not running" containerID="a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f" Oct 02 20:26:04 crc kubenswrapper[4909]: E1002 20:26:04.405987 4909 container_log_manager.go:307] "Failed to rotate log for container" err="failed to rotate log \"/var/log/pods/openshift-must-gather-rxh5m_crc-debug-mpgx5_61367c5a-70a4-430d-b7c8-5b1a92284dfd/container-00/0.log\": failed to reopen container log \"a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f\": rpc error: code = Unknown desc = container is not running" worker=1 containerID="a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f" path="/var/log/pods/openshift-must-gather-rxh5m_crc-debug-mpgx5_61367c5a-70a4-430d-b7c8-5b1a92284dfd/container-00/0.log" currentSize=88394158 maxSize=52428800 Oct 02 20:26:05 crc kubenswrapper[4909]: I1002 20:26:05.162780 4909 generic.go:334] "Generic (PLEG): container finished" podID="61367c5a-70a4-430d-b7c8-5b1a92284dfd" containerID="a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f" exitCode=0 Oct 02 20:26:05 crc kubenswrapper[4909]: I1002 20:26:05.162822 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" event={"ID":"61367c5a-70a4-430d-b7c8-5b1a92284dfd","Type":"ContainerDied","Data":"a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f"} Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.292696 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.363319 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host\") pod \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.363448 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host" (OuterVolumeSpecName: "host") pod "61367c5a-70a4-430d-b7c8-5b1a92284dfd" (UID: "61367c5a-70a4-430d-b7c8-5b1a92284dfd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.363561 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcc7q\" (UniqueName: \"kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q\") pod \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\" (UID: \"61367c5a-70a4-430d-b7c8-5b1a92284dfd\") " Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.364128 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61367c5a-70a4-430d-b7c8-5b1a92284dfd-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.368779 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q" (OuterVolumeSpecName: "kube-api-access-vcc7q") pod "61367c5a-70a4-430d-b7c8-5b1a92284dfd" (UID: "61367c5a-70a4-430d-b7c8-5b1a92284dfd"). InnerVolumeSpecName "kube-api-access-vcc7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:26:06 crc kubenswrapper[4909]: I1002 20:26:06.465846 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcc7q\" (UniqueName: \"kubernetes.io/projected/61367c5a-70a4-430d-b7c8-5b1a92284dfd-kube-api-access-vcc7q\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:07 crc kubenswrapper[4909]: I1002 20:26:07.183559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" event={"ID":"61367c5a-70a4-430d-b7c8-5b1a92284dfd","Type":"ContainerDied","Data":"107b67ed9560060e8d4f50763c504c8d8b00740f0147b7b20dc3eda6ebb0f58c"} Oct 02 20:26:07 crc kubenswrapper[4909]: I1002 20:26:07.183623 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-mpgx5" Oct 02 20:26:07 crc kubenswrapper[4909]: I1002 20:26:07.183607 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="107b67ed9560060e8d4f50763c504c8d8b00740f0147b7b20dc3eda6ebb0f58c" Oct 02 20:26:15 crc kubenswrapper[4909]: I1002 20:26:15.114057 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-mpgx5"] Oct 02 20:26:15 crc kubenswrapper[4909]: I1002 20:26:15.124999 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-mpgx5"] Oct 02 20:26:15 crc kubenswrapper[4909]: I1002 20:26:15.620131 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61367c5a-70a4-430d-b7c8-5b1a92284dfd" path="/var/lib/kubelet/pods/61367c5a-70a4-430d-b7c8-5b1a92284dfd/volumes" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.286156 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-4tj4v"] Oct 02 20:26:16 crc kubenswrapper[4909]: E1002 20:26:16.286680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61367c5a-70a4-430d-b7c8-5b1a92284dfd" 
containerName="container-00" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.286698 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="61367c5a-70a4-430d-b7c8-5b1a92284dfd" containerName="container-00" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.286924 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="61367c5a-70a4-430d-b7c8-5b1a92284dfd" containerName="container-00" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.287665 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.363621 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.363763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvj7m\" (UniqueName: \"kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.466208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvj7m\" (UniqueName: \"kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.466441 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.466513 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.486574 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvj7m\" (UniqueName: \"kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m\") pod \"crc-debug-4tj4v\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:16 crc kubenswrapper[4909]: I1002 20:26:16.610975 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:17 crc kubenswrapper[4909]: I1002 20:26:17.293124 4909 generic.go:334] "Generic (PLEG): container finished" podID="25a37e03-7e9b-4347-a5d5-bbf225556b0c" containerID="f5deb8e4e2c6fa28be98be06b2d447d37687d0df40d1e331bec2bff4748e65fc" exitCode=0 Oct 02 20:26:17 crc kubenswrapper[4909]: I1002 20:26:17.293299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" event={"ID":"25a37e03-7e9b-4347-a5d5-bbf225556b0c","Type":"ContainerDied","Data":"f5deb8e4e2c6fa28be98be06b2d447d37687d0df40d1e331bec2bff4748e65fc"} Oct 02 20:26:17 crc kubenswrapper[4909]: I1002 20:26:17.293564 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" event={"ID":"25a37e03-7e9b-4347-a5d5-bbf225556b0c","Type":"ContainerStarted","Data":"a7f6784ea033b371d58ebfdbcf58ea7272880c4138a60e1059a9e22f12319ec4"} Oct 02 20:26:17 crc kubenswrapper[4909]: I1002 20:26:17.343416 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-4tj4v"] Oct 02 20:26:17 crc kubenswrapper[4909]: I1002 20:26:17.354935 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rxh5m/crc-debug-4tj4v"] Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.431106 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.513795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host\") pod \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.513916 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvj7m\" (UniqueName: \"kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m\") pod \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\" (UID: \"25a37e03-7e9b-4347-a5d5-bbf225556b0c\") " Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.514173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host" (OuterVolumeSpecName: "host") pod "25a37e03-7e9b-4347-a5d5-bbf225556b0c" (UID: "25a37e03-7e9b-4347-a5d5-bbf225556b0c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.514636 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a37e03-7e9b-4347-a5d5-bbf225556b0c-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.538345 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m" (OuterVolumeSpecName: "kube-api-access-bvj7m") pod "25a37e03-7e9b-4347-a5d5-bbf225556b0c" (UID: "25a37e03-7e9b-4347-a5d5-bbf225556b0c"). InnerVolumeSpecName "kube-api-access-bvj7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:26:18 crc kubenswrapper[4909]: I1002 20:26:18.617060 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvj7m\" (UniqueName: \"kubernetes.io/projected/25a37e03-7e9b-4347-a5d5-bbf225556b0c-kube-api-access-bvj7m\") on node \"crc\" DevicePath \"\"" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.315039 4909 scope.go:117] "RemoveContainer" containerID="f5deb8e4e2c6fa28be98be06b2d447d37687d0df40d1e331bec2bff4748e65fc" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.315080 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh5m/crc-debug-4tj4v" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.387620 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.591304 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.610273 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.670221 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a37e03-7e9b-4347-a5d5-bbf225556b0c" path="/var/lib/kubelet/pods/25a37e03-7e9b-4347-a5d5-bbf225556b0c/volumes" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.670337 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.807811 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.886330 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/extract/0.log" Oct 02 20:26:19 crc kubenswrapper[4909]: I1002 20:26:19.904145 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.016885 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-fz856_848033b4-f22a-4455-8699-45b7761bbee2/kube-rbac-proxy/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.116349 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-2lwdw_c108279d-762f-41ee-a725-55e9f58a8686/kube-rbac-proxy/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.162149 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-fz856_848033b4-f22a-4455-8699-45b7761bbee2/manager/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.277438 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-2lwdw_c108279d-762f-41ee-a725-55e9f58a8686/manager/0.log" Oct 02 20:26:20 crc 
kubenswrapper[4909]: I1002 20:26:20.386552 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9jlbp_6f98751d-f257-4445-a739-7ff447f1c3d8/manager/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.389046 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9jlbp_6f98751d-f257-4445-a739-7ff447f1c3d8/kube-rbac-proxy/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.558501 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-87dms_27b610f7-4219-4cb1-97b4-76a4627afc7a/kube-rbac-proxy/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.642136 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-87dms_27b610f7-4219-4cb1-97b4-76a4627afc7a/manager/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.832018 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-28j82_f87d6d15-86ff-46e3-8361-447ce9aff98c/kube-rbac-proxy/0.log" Oct 02 20:26:20 crc kubenswrapper[4909]: I1002 20:26:20.948194 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-28j82_f87d6d15-86ff-46e3-8361-447ce9aff98c/manager/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.017458 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-hkdrb_ba5e23b3-da6b-41ec-9cac-11d4bc3a068b/kube-rbac-proxy/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.070779 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-hkdrb_ba5e23b3-da6b-41ec-9cac-11d4bc3a068b/manager/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.156615 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-bh4kt_d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1/kube-rbac-proxy/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.337944 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-bh4kt_d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1/manager/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.398274 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-886ls_30b85c17-5d08-452e-9099-20e55cc86f9e/kube-rbac-proxy/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.412975 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-886ls_30b85c17-5d08-452e-9099-20e55cc86f9e/manager/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.595015 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-9mjtf_6d4671d7-88da-4834-afb5-5deaf7e84cdb/kube-rbac-proxy/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.704527 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-9mjtf_6d4671d7-88da-4834-afb5-5deaf7e84cdb/manager/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.812839 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-hwgrp_22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff/kube-rbac-proxy/0.log" Oct 02 20:26:21 crc kubenswrapper[4909]: I1002 20:26:21.915752 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-hwgrp_22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.108792 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-57hjm_c2bd7e2c-d50f-4808-85f5-4bae9b91d272/kube-rbac-proxy/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.209505 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-94lx8_5198c9b3-5c15-49de-9f3a-da04c80c4eb3/kube-rbac-proxy/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.245982 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-57hjm_c2bd7e2c-d50f-4808-85f5-4bae9b91d272/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.349633 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-94lx8_5198c9b3-5c15-49de-9f3a-da04c80c4eb3/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.447582 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-zxm4p_18e949b2-6404-43e9-954b-6a09780bf021/kube-rbac-proxy/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.494345 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-zxm4p_18e949b2-6404-43e9-954b-6a09780bf021/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.584988 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-g2lmn_d89c2056-bfda-4177-86e6-bc00964d5f22/kube-rbac-proxy/0.log" Oct 02 20:26:22 crc 
kubenswrapper[4909]: I1002 20:26:22.693874 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-g2lmn_d89c2056-bfda-4177-86e6-bc00964d5f22/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.817499 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678btf42_b96ea387-f91f-47ef-ba02-4f61e8a750a3/kube-rbac-proxy/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.827333 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678btf42_b96ea387-f91f-47ef-ba02-4f61e8a750a3/manager/0.log" Oct 02 20:26:22 crc kubenswrapper[4909]: I1002 20:26:22.927221 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bffff79d9-6k5xz_92aaf752-fa7f-42da-98a1-298dc1f1f745/kube-rbac-proxy/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.005860 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c58d4ffff-87ksp_16c56fa3-a7be-4a4c-ba04-967d7a5f1fec/kube-rbac-proxy/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.449806 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6mfjw_b138a340-c0d7-436b-8308-14e2ea7a76a9/registry-server/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.560409 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c58d4ffff-87ksp_16c56fa3-a7be-4a4c-ba04-967d7a5f1fec/operator/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.695750 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-p9sbt_404b32bd-4a96-46d7-b8fa-bcd5fde074aa/manager/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.702673 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-p9sbt_404b32bd-4a96-46d7-b8fa-bcd5fde074aa/kube-rbac-proxy/0.log" Oct 02 20:26:23 crc kubenswrapper[4909]: I1002 20:26:23.894587 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-d5vkl_9a100c33-7cb7-4af6-8262-c6255b252462/kube-rbac-proxy/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.014954 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-d5vkl_9a100c33-7cb7-4af6-8262-c6255b252462/manager/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.113637 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf_2b364096-1561-45dd-9c7c-0e20f76360a6/operator/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.265900 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dmsmk_989d2b9b-e976-4e58-a506-5d5755154be0/kube-rbac-proxy/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.314447 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dmsmk_989d2b9b-e976-4e58-a506-5d5755154be0/manager/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.385931 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-769bf6645d-mpcx5_8e79ecec-ac98-42dd-b513-071fbd0e235a/kube-rbac-proxy/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.542265 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mfzz4_59585305-96e5-40d8-b9f0-37d76e01f40f/kube-rbac-proxy/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.585344 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mfzz4_59585305-96e5-40d8-b9f0-37d76e01f40f/manager/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.684334 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bffff79d9-6k5xz_92aaf752-fa7f-42da-98a1-298dc1f1f745/manager/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.767097 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8pxwr_40106a12-6dd2-492f-94e0-b3ab9e866b81/kube-rbac-proxy/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.818674 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8pxwr_40106a12-6dd2-492f-94e0-b3ab9e866b81/manager/0.log" Oct 02 20:26:24 crc kubenswrapper[4909]: I1002 20:26:24.924617 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-769bf6645d-mpcx5_8e79ecec-ac98-42dd-b513-071fbd0e235a/manager/0.log" Oct 02 20:26:42 crc kubenswrapper[4909]: I1002 20:26:42.585355 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h4pvv_a7cc0f59-db36-4e76-93fe-4c99f3e621a0/control-plane-machine-set-operator/0.log" Oct 02 20:26:42 crc kubenswrapper[4909]: I1002 20:26:42.766254 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgbzj_4bcc6ed7-1c31-4831-8b45-2729aaa5f89c/kube-rbac-proxy/0.log" Oct 02 
20:26:42 crc kubenswrapper[4909]: I1002 20:26:42.817605 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgbzj_4bcc6ed7-1c31-4831-8b45-2729aaa5f89c/machine-api-operator/0.log" Oct 02 20:26:57 crc kubenswrapper[4909]: I1002 20:26:57.158143 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wscv9_69b56b01-8070-4514-8168-51f33b7a2d07/cert-manager-controller/0.log" Oct 02 20:26:57 crc kubenswrapper[4909]: I1002 20:26:57.349667 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lf4lx_48294402-b72e-4b64-a227-a4ccb355ef9f/cert-manager-cainjector/0.log" Oct 02 20:26:57 crc kubenswrapper[4909]: I1002 20:26:57.352594 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nqkz8_52d78023-98ec-431e-a697-0aab89fc7e8a/cert-manager-webhook/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.429766 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-s4vhz_a0702a66-247b-4591-95cf-2ee69ffb5473/nmstate-console-plugin/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.657224 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gts4l_f283eb50-d6fb-464c-a075-2e79f4d56305/kube-rbac-proxy/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.667960 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gdng9_416821c4-4252-4c41-9e8d-5bf7689aae61/nmstate-handler/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.742190 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gts4l_f283eb50-d6fb-464c-a075-2e79f4d56305/nmstate-metrics/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.874248 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-vlhrp_2c0fec14-d9a3-4b4d-96dd-a22938d3c736/nmstate-operator/0.log" Oct 02 20:27:10 crc kubenswrapper[4909]: I1002 20:27:10.940436 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-mfztc_e036fd92-1027-4ac0-bde4-1c69c7fa7d4c/nmstate-webhook/0.log" Oct 02 20:27:23 crc kubenswrapper[4909]: I1002 20:27:23.888661 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/kube-rbac-proxy/0.log" Oct 02 20:27:23 crc kubenswrapper[4909]: I1002 20:27:23.899241 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/manager/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.467058 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-8958c8b87-fqrjj_63728faa-74d7-4f8c-baab-348f0a26da5b/cluster-logging-operator/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.635007 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-t77xg_3903e4b2-91fd-4a38-880e-543863535cf5/collector/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.732440 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_90528ba7-f037-4809-959c-26c1a511bb84/loki-compactor/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.810227 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-6f5f7fff97-897dd_68d54b28-bf98-4e97-a5f8-9cc1abb31a5d/loki-distributor/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.918129 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-5dfzs_1714d70d-b81b-4886-816f-da1588c7364a/gateway/0.log" Oct 02 20:27:38 crc kubenswrapper[4909]: I1002 20:27:38.938679 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-5dfzs_1714d70d-b81b-4886-816f-da1588c7364a/opa/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.089953 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-htpg9_42692035-de07-49cc-b2f6-2305a4ff6f31/gateway/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.152754 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-htpg9_42692035-de07-49cc-b2f6-2305a4ff6f31/opa/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.293818 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_f384831b-fa88-451d-9e4c-3181bd39ed42/loki-index-gateway/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.457200 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_3af72dc6-9572-44cd-b4b8-cab3a6857f08/loki-ingester/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.552165 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5d954896cf-x8snb_8203105d-afce-403b-8a70-d624672e2826/loki-querier/0.log" Oct 02 20:27:39 crc kubenswrapper[4909]: I1002 20:27:39.709290 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6fbbbc8b7d-pp4hn_1ddf8d24-c907-43af-bb68-ee9a2c28fd67/loki-query-frontend/0.log" Oct 02 20:27:53 crc kubenswrapper[4909]: I1002 20:27:53.054409 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:27:53 crc kubenswrapper[4909]: I1002 20:27:53.054999 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.055824 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-vwbdq_6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9/kube-rbac-proxy/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.338819 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-vwbdq_6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9/controller/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.349957 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.528180 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.538724 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.542303 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.543100 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.763883 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.775275 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.822502 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.837818 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:27:55 crc kubenswrapper[4909]: I1002 20:27:55.978748 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.002123 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.013090 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.036751 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/controller/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.167589 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/frr-metrics/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.245694 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/kube-rbac-proxy-frr/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.257088 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/kube-rbac-proxy/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.386828 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/reloader/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.486058 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-9wd5p_94246755-9836-440f-a7d3-5189dd6d1b6f/frr-k8s-webhook-server/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.782052 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-568bfffc64-vr4t5_04606b24-1344-4bc4-a7b5-c5bd3282afec/manager/0.log" Oct 02 20:27:56 crc kubenswrapper[4909]: I1002 20:27:56.880579 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-f8cdfbb4-bh9bj_9ffcfe84-1e40-4d9f-b411-4f8707346e92/webhook-server/0.log" Oct 02 20:27:57 crc kubenswrapper[4909]: I1002 20:27:57.044939 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x67sq_7da13784-fb30-48c8-8a21-99e151b56645/kube-rbac-proxy/0.log" Oct 02 20:27:57 crc kubenswrapper[4909]: I1002 20:27:57.782302 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x67sq_7da13784-fb30-48c8-8a21-99e151b56645/speaker/0.log" Oct 02 20:27:58 crc kubenswrapper[4909]: I1002 20:27:58.227928 4909 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/frr/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.227888 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.486544 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.531359 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.534216 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.716080 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.745571 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/extract/0.log" Oct 02 20:28:12 crc kubenswrapper[4909]: I1002 20:28:12.788203 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:28:12 crc 
kubenswrapper[4909]: I1002 20:28:12.967390 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.133500 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.139611 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.193066 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.364478 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.369640 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.373209 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/extract/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.549001 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.749617 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.802494 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:28:13 crc kubenswrapper[4909]: I1002 20:28:13.829054 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.014263 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.016307 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.034487 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/extract/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.169016 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 
20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.420062 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.433142 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.433432 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.628275 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.630516 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.630834 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/extract/0.log" Oct 02 20:28:14 crc kubenswrapper[4909]: I1002 20:28:14.819723 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.209560 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.270039 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.277836 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.422566 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.428331 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.619973 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.898602 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.922892 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:28:15 crc kubenswrapper[4909]: I1002 20:28:15.964606 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.157785 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.226850 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.261689 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/registry-server/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.379929 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.557385 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.605855 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.608711 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.728906 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.801704 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/registry-server/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.832081 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/extract/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.835834 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:28:16 crc kubenswrapper[4909]: I1002 20:28:16.909960 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zvd2b_e7e3b138-c134-41ec-a5d6-af2e97914045/marketplace-operator/0.log" Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.007630 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log" Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.178204 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log" Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.188105 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log" Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.206530 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.373873 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.374538 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.559636 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.626887 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/registry-server/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.645630 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.686119 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.731588 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.913481 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log"
Oct 02 20:28:17 crc kubenswrapper[4909]: I1002 20:28:17.930746 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log"
Oct 02 20:28:18 crc kubenswrapper[4909]: I1002 20:28:18.691616 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/registry-server/0.log"
Oct 02 20:28:23 crc kubenswrapper[4909]: I1002 20:28:23.054348 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 20:28:23 crc kubenswrapper[4909]: I1002 20:28:23.055143 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.319624 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-zhfgm_355b5784-3dc4-4a65-93d8-4bdddd018358/prometheus-operator/0.log"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.434770 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_109ab413-2eeb-42e1-b39f-feddfe589bcc/prometheus-operator-admission-webhook/0.log"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.502682 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_38edcfed-aa44-406d-9028-8395eb3ebb06/prometheus-operator-admission-webhook/0.log"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.641364 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-mwrxx_e4b958c5-5914-4978-884c-e9f42430f52f/operator/0.log"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.769488 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-rtc7s_f89fa753-5d7e-4c80-8847-25eeaee0c3e3/observability-ui-dashboards/0.log"
Oct 02 20:28:31 crc kubenswrapper[4909]: I1002 20:28:31.854457 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-9btfm_671892a9-6a46-445d-b470-6c16b55b8818/perses-operator/0.log"
Oct 02 20:28:44 crc kubenswrapper[4909]: I1002 20:28:44.645555 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/kube-rbac-proxy/0.log"
Oct 02 20:28:44 crc kubenswrapper[4909]: I1002 20:28:44.703639 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/manager/0.log"
Oct 02 20:28:53 crc kubenswrapper[4909]: I1002 20:28:53.054361 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 20:28:53 crc kubenswrapper[4909]: I1002 20:28:53.055131 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 20:28:53 crc kubenswrapper[4909]: I1002 20:28:53.055204 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h"
Oct 02 20:28:53 crc kubenswrapper[4909]: I1002 20:28:53.056342 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 20:28:53 crc kubenswrapper[4909]: I1002 20:28:53.056423 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" gracePeriod=600
Oct 02 20:28:53 crc kubenswrapper[4909]: E1002 20:28:53.189716 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:28:54 crc kubenswrapper[4909]: I1002 20:28:54.021760 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" exitCode=0
Oct 02 20:28:54 crc kubenswrapper[4909]: I1002 20:28:54.021975 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"}
Oct 02 20:28:54 crc kubenswrapper[4909]: I1002 20:28:54.022129 4909 scope.go:117] "RemoveContainer" containerID="fc55a90cc76e5997f77ce636d058ce66c6d73901eb7260d4b6b8cf7f52b9ebe1"
Oct 02 20:28:54 crc kubenswrapper[4909]: I1002 20:28:54.022845 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:28:54 crc kubenswrapper[4909]: E1002 20:28:54.023144 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.789922 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:28:55 crc kubenswrapper[4909]: E1002 20:28:55.790415 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a37e03-7e9b-4347-a5d5-bbf225556b0c" containerName="container-00"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.790427 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a37e03-7e9b-4347-a5d5-bbf225556b0c" containerName="container-00"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.790672 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a37e03-7e9b-4347-a5d5-bbf225556b0c" containerName="container-00"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.792233 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.803009 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.901706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.902010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zg5t\" (UniqueName: \"kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:55 crc kubenswrapper[4909]: I1002 20:28:55.902158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.004374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.004661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zg5t\" (UniqueName: \"kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.004804 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.004894 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.005223 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.034167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zg5t\" (UniqueName: \"kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t\") pod \"redhat-marketplace-7b22d\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") " pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.145471 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:28:56 crc kubenswrapper[4909]: I1002 20:28:56.740655 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:28:57 crc kubenswrapper[4909]: I1002 20:28:57.051118 4909 generic.go:334] "Generic (PLEG): container finished" podID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerID="828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b" exitCode=0
Oct 02 20:28:57 crc kubenswrapper[4909]: I1002 20:28:57.051169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerDied","Data":"828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b"}
Oct 02 20:28:57 crc kubenswrapper[4909]: I1002 20:28:57.051418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerStarted","Data":"354eb50fdf355beb8efc18743eaf61b86e07ab858413b8d12073b2b9f25fed4d"}
Oct 02 20:28:57 crc kubenswrapper[4909]: I1002 20:28:57.054533 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 20:28:58 crc kubenswrapper[4909]: I1002 20:28:58.064973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerStarted","Data":"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"}
Oct 02 20:28:59 crc kubenswrapper[4909]: I1002 20:28:59.075921 4909 generic.go:334] "Generic (PLEG): container finished" podID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerID="f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0" exitCode=0
Oct 02 20:28:59 crc kubenswrapper[4909]: I1002 20:28:59.076002 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerDied","Data":"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"}
Oct 02 20:29:00 crc kubenswrapper[4909]: I1002 20:29:00.092343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerStarted","Data":"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"}
Oct 02 20:29:00 crc kubenswrapper[4909]: I1002 20:29:00.148067 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7b22d" podStartSLOduration=2.563083079 podStartE2EDuration="5.14804066s" podCreationTimestamp="2025-10-02 20:28:55 +0000 UTC" firstStartedPulling="2025-10-02 20:28:57.054223903 +0000 UTC m=+7858.241719762" lastFinishedPulling="2025-10-02 20:28:59.639181484 +0000 UTC m=+7860.826677343" observedRunningTime="2025-10-02 20:29:00.111143308 +0000 UTC m=+7861.298639207" watchObservedRunningTime="2025-10-02 20:29:00.14804066 +0000 UTC m=+7861.335536519"
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.146573 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.147285 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.205734 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.296343 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.444869 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:29:06 crc kubenswrapper[4909]: I1002 20:29:06.608850 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:29:06 crc kubenswrapper[4909]: E1002 20:29:06.609246 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.206912 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7b22d" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="registry-server" containerID="cri-o://5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57" gracePeriod=2
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.758840 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.837530 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zg5t\" (UniqueName: \"kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t\") pod \"0e0814ba-666f-44a5-ba07-a52041e0d99f\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") "
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.837805 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities\") pod \"0e0814ba-666f-44a5-ba07-a52041e0d99f\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") "
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.837876 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content\") pod \"0e0814ba-666f-44a5-ba07-a52041e0d99f\" (UID: \"0e0814ba-666f-44a5-ba07-a52041e0d99f\") "
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.839427 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities" (OuterVolumeSpecName: "utilities") pod "0e0814ba-666f-44a5-ba07-a52041e0d99f" (UID: "0e0814ba-666f-44a5-ba07-a52041e0d99f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.844769 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t" (OuterVolumeSpecName: "kube-api-access-9zg5t") pod "0e0814ba-666f-44a5-ba07-a52041e0d99f" (UID: "0e0814ba-666f-44a5-ba07-a52041e0d99f"). InnerVolumeSpecName "kube-api-access-9zg5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.855068 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e0814ba-666f-44a5-ba07-a52041e0d99f" (UID: "0e0814ba-666f-44a5-ba07-a52041e0d99f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.940798 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.940848 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0814ba-666f-44a5-ba07-a52041e0d99f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 20:29:08 crc kubenswrapper[4909]: I1002 20:29:08.940876 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zg5t\" (UniqueName: \"kubernetes.io/projected/0e0814ba-666f-44a5-ba07-a52041e0d99f-kube-api-access-9zg5t\") on node \"crc\" DevicePath \"\""
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.224152 4909 generic.go:334] "Generic (PLEG): container finished" podID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerID="5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57" exitCode=0
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.224218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerDied","Data":"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"}
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.224229 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7b22d"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.224266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7b22d" event={"ID":"0e0814ba-666f-44a5-ba07-a52041e0d99f","Type":"ContainerDied","Data":"354eb50fdf355beb8efc18743eaf61b86e07ab858413b8d12073b2b9f25fed4d"}
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.224298 4909 scope.go:117] "RemoveContainer" containerID="5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.261896 4909 scope.go:117] "RemoveContainer" containerID="f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.273270 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.285834 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7b22d"]
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.291247 4909 scope.go:117] "RemoveContainer" containerID="828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.353384 4909 scope.go:117] "RemoveContainer" containerID="5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"
Oct 02 20:29:09 crc kubenswrapper[4909]: E1002 20:29:09.353819 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57\": container with ID starting with 5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57 not found: ID does not exist" containerID="5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.353845 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57"} err="failed to get container status \"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57\": rpc error: code = NotFound desc = could not find container \"5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57\": container with ID starting with 5946307fd376d110796de17b497dee02f9bf22fab0520484501fa6cf54292d57 not found: ID does not exist"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.353866 4909 scope.go:117] "RemoveContainer" containerID="f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"
Oct 02 20:29:09 crc kubenswrapper[4909]: E1002 20:29:09.354410 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0\": container with ID starting with f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0 not found: ID does not exist" containerID="f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.354429 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0"} err="failed to get container status \"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0\": rpc error: code = NotFound desc = could not find container \"f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0\": container with ID starting with f6b09c59b07b913796c37338e9b105e6c7743f14a231b80a12cfc861effc3ee0 not found: ID does not exist"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.354443 4909 scope.go:117] "RemoveContainer" containerID="828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b"
Oct 02 20:29:09 crc kubenswrapper[4909]: E1002 20:29:09.354727 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b\": container with ID starting with 828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b not found: ID does not exist" containerID="828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.354747 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b"} err="failed to get container status \"828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b\": rpc error: code = NotFound desc = could not find container \"828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b\": container with ID starting with 828171a473552e66d7d018823eb9f58ac5e797cad047a0ccb6303cb0df3b7e4b not found: ID does not exist"
Oct 02 20:29:09 crc kubenswrapper[4909]: I1002 20:29:09.621562 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" path="/var/lib/kubelet/pods/0e0814ba-666f-44a5-ba07-a52041e0d99f/volumes"
Oct 02 20:29:20 crc kubenswrapper[4909]: I1002 20:29:20.608932 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:29:20 crc kubenswrapper[4909]: E1002 20:29:20.610070 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:29:33 crc kubenswrapper[4909]: I1002 20:29:33.612704 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:29:33 crc kubenswrapper[4909]: E1002 20:29:33.615946 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:29:47 crc kubenswrapper[4909]: I1002 20:29:47.622306 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:29:47 crc kubenswrapper[4909]: E1002 20:29:47.624605 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.174575 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"]
Oct 02 20:30:00 crc kubenswrapper[4909]: E1002 20:30:00.175500 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="extract-content"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.175512 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="extract-content"
Oct 02 20:30:00 crc kubenswrapper[4909]: E1002 20:30:00.175534 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="extract-utilities"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.175540 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="extract-utilities"
Oct 02 20:30:00 crc kubenswrapper[4909]: E1002 20:30:00.175567 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="registry-server"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.175572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="registry-server"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.175780 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0814ba-666f-44a5-ba07-a52041e0d99f" containerName="registry-server"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.176559 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.194343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"]
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.200113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.200362 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.272012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.272155 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2mz\" (UniqueName: \"kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.272219 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.374658 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.374810 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2mz\" (UniqueName: \"kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.374895 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.376451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.383728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.399985 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2mz\" (UniqueName: \"kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz\") pod \"collect-profiles-29323950-pqwb6\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.503810 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"
Oct 02 20:30:00 crc kubenswrapper[4909]: I1002 20:30:00.610952 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32"
Oct 02 20:30:00 crc kubenswrapper[4909]: E1002 20:30:00.611654 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:30:01 crc kubenswrapper[4909]: I1002 20:30:01.001700 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6"]
Oct 02 20:30:01 crc kubenswrapper[4909]: I1002 20:30:01.921041 4909 generic.go:334] "Generic (PLEG): container finished" podID="0eeca81f-f0f7-4a00-a680-653dbc9fa999" containerID="7c96dfb6f6570834acf0e0c0dd0150bc9f331c9989c2abd54f0769fdd3553c09" exitCode=0
Oct 02 20:30:01 crc kubenswrapper[4909]: I1002 20:30:01.921274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6" event={"ID":"0eeca81f-f0f7-4a00-a680-653dbc9fa999","Type":"ContainerDied","Data":"7c96dfb6f6570834acf0e0c0dd0150bc9f331c9989c2abd54f0769fdd3553c09"}
Oct 02 20:30:01 crc kubenswrapper[4909]: I1002 20:30:01.921418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6" event={"ID":"0eeca81f-f0f7-4a00-a680-653dbc9fa999","Type":"ContainerStarted","Data":"14ccfb8747726ab285feebdf6ef6b50e9e119a2691ce570fe13739d1a056dcc2"}
Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.429432 4909 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.548493 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr2mz\" (UniqueName: \"kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz\") pod \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.548577 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume\") pod \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.548616 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume\") pod \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\" (UID: \"0eeca81f-f0f7-4a00-a680-653dbc9fa999\") " Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.549557 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume" (OuterVolumeSpecName: "config-volume") pod "0eeca81f-f0f7-4a00-a680-653dbc9fa999" (UID: "0eeca81f-f0f7-4a00-a680-653dbc9fa999"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.554827 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0eeca81f-f0f7-4a00-a680-653dbc9fa999" (UID: "0eeca81f-f0f7-4a00-a680-653dbc9fa999"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.554906 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz" (OuterVolumeSpecName: "kube-api-access-tr2mz") pod "0eeca81f-f0f7-4a00-a680-653dbc9fa999" (UID: "0eeca81f-f0f7-4a00-a680-653dbc9fa999"). InnerVolumeSpecName "kube-api-access-tr2mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.651179 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr2mz\" (UniqueName: \"kubernetes.io/projected/0eeca81f-f0f7-4a00-a680-653dbc9fa999-kube-api-access-tr2mz\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.651214 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0eeca81f-f0f7-4a00-a680-653dbc9fa999-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.651227 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0eeca81f-f0f7-4a00-a680-653dbc9fa999-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.946070 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6" event={"ID":"0eeca81f-f0f7-4a00-a680-653dbc9fa999","Type":"ContainerDied","Data":"14ccfb8747726ab285feebdf6ef6b50e9e119a2691ce570fe13739d1a056dcc2"} Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.946553 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ccfb8747726ab285feebdf6ef6b50e9e119a2691ce570fe13739d1a056dcc2" Oct 02 20:30:03 crc kubenswrapper[4909]: I1002 20:30:03.946170 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323950-pqwb6" Oct 02 20:30:04 crc kubenswrapper[4909]: I1002 20:30:04.526540 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g"] Oct 02 20:30:04 crc kubenswrapper[4909]: I1002 20:30:04.541946 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323905-rth9g"] Oct 02 20:30:05 crc kubenswrapper[4909]: I1002 20:30:05.623760 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb10e292-a690-4516-928f-d6d18d85426b" path="/var/lib/kubelet/pods/eb10e292-a690-4516-928f-d6d18d85426b/volumes" Oct 02 20:30:11 crc kubenswrapper[4909]: I1002 20:30:11.610755 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:30:11 crc kubenswrapper[4909]: E1002 20:30:11.611989 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.169801 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:18 crc kubenswrapper[4909]: E1002 20:30:18.171514 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eeca81f-f0f7-4a00-a680-653dbc9fa999" containerName="collect-profiles" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.171534 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eeca81f-f0f7-4a00-a680-653dbc9fa999" 
containerName="collect-profiles" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.172573 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eeca81f-f0f7-4a00-a680-653dbc9fa999" containerName="collect-profiles" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.175631 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.190255 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.294018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.294467 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dp4v\" (UniqueName: \"kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.294551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.396751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8dp4v\" (UniqueName: \"kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.396849 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.396952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.397611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.397823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.421511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dp4v\" (UniqueName: 
\"kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v\") pod \"community-operators-mtcks\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:18 crc kubenswrapper[4909]: I1002 20:30:18.501916 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:19 crc kubenswrapper[4909]: I1002 20:30:19.054545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:19 crc kubenswrapper[4909]: W1002 20:30:19.061166 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7155c279_10ae_4c49_89e6_f2d79ed118b1.slice/crio-9c278efad3882ed72b43b3e2bb0f09653d0e1b8c8840a0846f94f35f55d32133 WatchSource:0}: Error finding container 9c278efad3882ed72b43b3e2bb0f09653d0e1b8c8840a0846f94f35f55d32133: Status 404 returned error can't find the container with id 9c278efad3882ed72b43b3e2bb0f09653d0e1b8c8840a0846f94f35f55d32133 Oct 02 20:30:19 crc kubenswrapper[4909]: I1002 20:30:19.199762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerStarted","Data":"9c278efad3882ed72b43b3e2bb0f09653d0e1b8c8840a0846f94f35f55d32133"} Oct 02 20:30:20 crc kubenswrapper[4909]: I1002 20:30:20.220158 4909 generic.go:334] "Generic (PLEG): container finished" podID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerID="99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861" exitCode=0 Oct 02 20:30:20 crc kubenswrapper[4909]: I1002 20:30:20.220272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" 
event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerDied","Data":"99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861"} Oct 02 20:30:22 crc kubenswrapper[4909]: I1002 20:30:22.246304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerStarted","Data":"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd"} Oct 02 20:30:23 crc kubenswrapper[4909]: I1002 20:30:23.263427 4909 generic.go:334] "Generic (PLEG): container finished" podID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerID="d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd" exitCode=0 Oct 02 20:30:23 crc kubenswrapper[4909]: I1002 20:30:23.263494 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerDied","Data":"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd"} Oct 02 20:30:24 crc kubenswrapper[4909]: I1002 20:30:24.290845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerStarted","Data":"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc"} Oct 02 20:30:24 crc kubenswrapper[4909]: I1002 20:30:24.308872 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mtcks" podStartSLOduration=2.820883464 podStartE2EDuration="6.308852807s" podCreationTimestamp="2025-10-02 20:30:18 +0000 UTC" firstStartedPulling="2025-10-02 20:30:20.22292882 +0000 UTC m=+7941.410424679" lastFinishedPulling="2025-10-02 20:30:23.710898143 +0000 UTC m=+7944.898394022" observedRunningTime="2025-10-02 20:30:24.307436043 +0000 UTC m=+7945.494931922" watchObservedRunningTime="2025-10-02 20:30:24.308852807 +0000 UTC 
m=+7945.496348666" Oct 02 20:30:26 crc kubenswrapper[4909]: I1002 20:30:26.608436 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:30:26 crc kubenswrapper[4909]: E1002 20:30:26.609152 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:30:28 crc kubenswrapper[4909]: I1002 20:30:28.502417 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:28 crc kubenswrapper[4909]: I1002 20:30:28.502751 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:29 crc kubenswrapper[4909]: I1002 20:30:29.585206 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mtcks" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="registry-server" probeResult="failure" output=< Oct 02 20:30:29 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:30:29 crc kubenswrapper[4909]: > Oct 02 20:30:30 crc kubenswrapper[4909]: I1002 20:30:30.804188 4909 scope.go:117] "RemoveContainer" containerID="6eb9f48fb84bef14011a8f04a225c0fdc2923990f0517e858d21a2afea0c9622" Oct 02 20:30:38 crc kubenswrapper[4909]: I1002 20:30:38.567842 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:38 crc kubenswrapper[4909]: I1002 20:30:38.642879 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:38 crc kubenswrapper[4909]: I1002 20:30:38.830565 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:40 crc kubenswrapper[4909]: I1002 20:30:40.507911 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mtcks" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="registry-server" containerID="cri-o://035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc" gracePeriod=2 Oct 02 20:30:40 crc kubenswrapper[4909]: I1002 20:30:40.608337 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:30:40 crc kubenswrapper[4909]: E1002 20:30:40.608855 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.243350 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.291832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content\") pod \"7155c279-10ae-4c49-89e6-f2d79ed118b1\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.292594 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities\") pod \"7155c279-10ae-4c49-89e6-f2d79ed118b1\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.292840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dp4v\" (UniqueName: \"kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v\") pod \"7155c279-10ae-4c49-89e6-f2d79ed118b1\" (UID: \"7155c279-10ae-4c49-89e6-f2d79ed118b1\") " Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.294142 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities" (OuterVolumeSpecName: "utilities") pod "7155c279-10ae-4c49-89e6-f2d79ed118b1" (UID: "7155c279-10ae-4c49-89e6-f2d79ed118b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.305270 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v" (OuterVolumeSpecName: "kube-api-access-8dp4v") pod "7155c279-10ae-4c49-89e6-f2d79ed118b1" (UID: "7155c279-10ae-4c49-89e6-f2d79ed118b1"). InnerVolumeSpecName "kube-api-access-8dp4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.349276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7155c279-10ae-4c49-89e6-f2d79ed118b1" (UID: "7155c279-10ae-4c49-89e6-f2d79ed118b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.394698 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.394921 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dp4v\" (UniqueName: \"kubernetes.io/projected/7155c279-10ae-4c49-89e6-f2d79ed118b1-kube-api-access-8dp4v\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.394983 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7155c279-10ae-4c49-89e6-f2d79ed118b1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.524972 4909 generic.go:334] "Generic (PLEG): container finished" podID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerID="035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc" exitCode=0 Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.525063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerDied","Data":"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc"} Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.525115 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mtcks" event={"ID":"7155c279-10ae-4c49-89e6-f2d79ed118b1","Type":"ContainerDied","Data":"9c278efad3882ed72b43b3e2bb0f09653d0e1b8c8840a0846f94f35f55d32133"} Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.525144 4909 scope.go:117] "RemoveContainer" containerID="035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.525379 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtcks" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.558846 4909 scope.go:117] "RemoveContainer" containerID="d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.586816 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.604713 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mtcks"] Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.613719 4909 scope.go:117] "RemoveContainer" containerID="99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.625218 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" path="/var/lib/kubelet/pods/7155c279-10ae-4c49-89e6-f2d79ed118b1/volumes" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.698721 4909 scope.go:117] "RemoveContainer" containerID="035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc" Oct 02 20:30:41 crc kubenswrapper[4909]: E1002 20:30:41.700737 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc\": container with ID 
starting with 035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc not found: ID does not exist" containerID="035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.700798 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc"} err="failed to get container status \"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc\": rpc error: code = NotFound desc = could not find container \"035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc\": container with ID starting with 035c5a08813eae57e50b347376e6cc4b1908055ac327e73e2754428fe92393bc not found: ID does not exist" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.700826 4909 scope.go:117] "RemoveContainer" containerID="d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd" Oct 02 20:30:41 crc kubenswrapper[4909]: E1002 20:30:41.701598 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd\": container with ID starting with d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd not found: ID does not exist" containerID="d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.701634 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd"} err="failed to get container status \"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd\": rpc error: code = NotFound desc = could not find container \"d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd\": container with ID starting with d599702c5bf257308cd2fbb1b9bdaeade37e93bd738660634fd0404dd89daefd not found: 
ID does not exist" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.701655 4909 scope.go:117] "RemoveContainer" containerID="99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861" Oct 02 20:30:41 crc kubenswrapper[4909]: E1002 20:30:41.703706 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861\": container with ID starting with 99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861 not found: ID does not exist" containerID="99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861" Oct 02 20:30:41 crc kubenswrapper[4909]: I1002 20:30:41.703738 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861"} err="failed to get container status \"99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861\": rpc error: code = NotFound desc = could not find container \"99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861\": container with ID starting with 99160fda38dbf95e1a8c7836136b181445f760d328f586d134ae9e95775ed861 not found: ID does not exist" Oct 02 20:30:53 crc kubenswrapper[4909]: I1002 20:30:53.608573 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:30:53 crc kubenswrapper[4909]: E1002 20:30:53.610225 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:31:04 crc kubenswrapper[4909]: E1002 20:31:04.459924 4909 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bbce0e4_7c78_4f5c_a4b3_17fa147db670.slice/crio-64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bbce0e4_7c78_4f5c_a4b3_17fa147db670.slice/crio-conmon-64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6.scope\": RecentStats: unable to find data in memory cache]" Oct 02 20:31:04 crc kubenswrapper[4909]: I1002 20:31:04.866391 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerID="64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6" exitCode=0 Oct 02 20:31:04 crc kubenswrapper[4909]: I1002 20:31:04.866458 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh5m/must-gather-pndst" event={"ID":"6bbce0e4-7c78-4f5c-a4b3-17fa147db670","Type":"ContainerDied","Data":"64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6"} Oct 02 20:31:04 crc kubenswrapper[4909]: I1002 20:31:04.867619 4909 scope.go:117] "RemoveContainer" containerID="64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6" Oct 02 20:31:05 crc kubenswrapper[4909]: I1002 20:31:05.352491 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh5m_must-gather-pndst_6bbce0e4-7c78-4f5c-a4b3-17fa147db670/gather/0.log" Oct 02 20:31:07 crc kubenswrapper[4909]: I1002 20:31:07.608820 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:31:07 crc kubenswrapper[4909]: E1002 20:31:07.609090 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:31:16 crc kubenswrapper[4909]: I1002 20:31:16.694636 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxh5m/must-gather-pndst"] Oct 02 20:31:16 crc kubenswrapper[4909]: I1002 20:31:16.695908 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rxh5m/must-gather-pndst" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="copy" containerID="cri-o://d482af58ab3304b42829015ef141ce0d16d6182d72ad2e24852414aba5ee642b" gracePeriod=2 Oct 02 20:31:16 crc kubenswrapper[4909]: I1002 20:31:16.707969 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rxh5m/must-gather-pndst"] Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.023089 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh5m_must-gather-pndst_6bbce0e4-7c78-4f5c-a4b3-17fa147db670/copy/0.log" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.023898 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerID="d482af58ab3304b42829015ef141ce0d16d6182d72ad2e24852414aba5ee642b" exitCode=143 Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.261899 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh5m_must-gather-pndst_6bbce0e4-7c78-4f5c-a4b3-17fa147db670/copy/0.log" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.262371 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.289596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mln5s\" (UniqueName: \"kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s\") pod \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.289781 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output\") pod \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\" (UID: \"6bbce0e4-7c78-4f5c-a4b3-17fa147db670\") " Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.294664 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s" (OuterVolumeSpecName: "kube-api-access-mln5s") pod "6bbce0e4-7c78-4f5c-a4b3-17fa147db670" (UID: "6bbce0e4-7c78-4f5c-a4b3-17fa147db670"). InnerVolumeSpecName "kube-api-access-mln5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.392981 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mln5s\" (UniqueName: \"kubernetes.io/projected/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-kube-api-access-mln5s\") on node \"crc\" DevicePath \"\"" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.485601 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6bbce0e4-7c78-4f5c-a4b3-17fa147db670" (UID: "6bbce0e4-7c78-4f5c-a4b3-17fa147db670"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.495368 4909 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bbce0e4-7c78-4f5c-a4b3-17fa147db670-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 20:31:17 crc kubenswrapper[4909]: I1002 20:31:17.623619 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" path="/var/lib/kubelet/pods/6bbce0e4-7c78-4f5c-a4b3-17fa147db670/volumes" Oct 02 20:31:18 crc kubenswrapper[4909]: I1002 20:31:18.034590 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh5m_must-gather-pndst_6bbce0e4-7c78-4f5c-a4b3-17fa147db670/copy/0.log" Oct 02 20:31:18 crc kubenswrapper[4909]: I1002 20:31:18.035191 4909 scope.go:117] "RemoveContainer" containerID="d482af58ab3304b42829015ef141ce0d16d6182d72ad2e24852414aba5ee642b" Oct 02 20:31:18 crc kubenswrapper[4909]: I1002 20:31:18.035335 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh5m/must-gather-pndst" Oct 02 20:31:18 crc kubenswrapper[4909]: I1002 20:31:18.062376 4909 scope.go:117] "RemoveContainer" containerID="64292fe9d83c6ddbc4f50cc5b09cd3694bbc8d349f925e6cc42c04a21cc1ece6" Oct 02 20:31:20 crc kubenswrapper[4909]: I1002 20:31:20.609382 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:31:20 crc kubenswrapper[4909]: E1002 20:31:20.610430 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:31:34 crc kubenswrapper[4909]: I1002 20:31:34.608998 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:31:34 crc kubenswrapper[4909]: E1002 20:31:34.609687 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.963039 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xbqd/must-gather-d9bgq"] Oct 02 20:31:41 crc kubenswrapper[4909]: E1002 20:31:41.963798 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="extract-utilities" Oct 02 20:31:41 crc 
kubenswrapper[4909]: I1002 20:31:41.963813 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="extract-utilities" Oct 02 20:31:41 crc kubenswrapper[4909]: E1002 20:31:41.963852 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="extract-content" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.963859 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="extract-content" Oct 02 20:31:41 crc kubenswrapper[4909]: E1002 20:31:41.963879 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="registry-server" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.963889 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="registry-server" Oct 02 20:31:41 crc kubenswrapper[4909]: E1002 20:31:41.963903 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="copy" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.963911 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="copy" Oct 02 20:31:41 crc kubenswrapper[4909]: E1002 20:31:41.963931 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="gather" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.963941 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="gather" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.964265 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7155c279-10ae-4c49-89e6-f2d79ed118b1" containerName="registry-server" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.964301 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="gather" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.964313 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bbce0e4-7c78-4f5c-a4b3-17fa147db670" containerName="copy" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.965965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.969763 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7xbqd"/"openshift-service-ca.crt" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.969965 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7xbqd"/"kube-root-ca.crt" Oct 02 20:31:41 crc kubenswrapper[4909]: I1002 20:31:41.985809 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xbqd/must-gather-d9bgq"] Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.030318 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvk6\" (UniqueName: \"kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.030384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.131933 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9dvk6\" (UniqueName: \"kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.131992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.133379 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.162306 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvk6\" (UniqueName: \"kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6\") pod \"must-gather-d9bgq\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.297859 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:31:42 crc kubenswrapper[4909]: I1002 20:31:42.793849 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xbqd/must-gather-d9bgq"] Oct 02 20:31:43 crc kubenswrapper[4909]: I1002 20:31:43.377945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" event={"ID":"2f46ef23-b745-4df8-a41b-65046d7a873e","Type":"ContainerStarted","Data":"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e"} Oct 02 20:31:43 crc kubenswrapper[4909]: I1002 20:31:43.378339 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" event={"ID":"2f46ef23-b745-4df8-a41b-65046d7a873e","Type":"ContainerStarted","Data":"d4a302313eb5622728ab9a3142e1855ccca28d1c01839d4c408b731c2cc4269f"} Oct 02 20:31:44 crc kubenswrapper[4909]: I1002 20:31:44.390942 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" event={"ID":"2f46ef23-b745-4df8-a41b-65046d7a873e","Type":"ContainerStarted","Data":"a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847"} Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.528681 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" podStartSLOduration=6.528659975 podStartE2EDuration="6.528659975s" podCreationTimestamp="2025-10-02 20:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 20:31:44.416105678 +0000 UTC m=+8025.603601527" watchObservedRunningTime="2025-10-02 20:31:47.528659975 +0000 UTC m=+8028.716155834" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.530504 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-kxtgm"] Oct 02 20:31:47 crc kubenswrapper[4909]: 
I1002 20:31:47.531913 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.534113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7xbqd"/"default-dockercfg-c26pb" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.675851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw8p2\" (UniqueName: \"kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.675921 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.777463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.777761 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw8p2\" (UniqueName: \"kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.778001 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.805953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw8p2\" (UniqueName: \"kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2\") pod \"crc-debug-kxtgm\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: I1002 20:31:47.859619 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:31:47 crc kubenswrapper[4909]: W1002 20:31:47.893005 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee38b21_8bfa_498c_bc5e_b4011e803c80.slice/crio-6be7ebc3cf307e66bcb1717111a8bfffd5e81ff9d9d01b5d296fabbb92d7f7cf WatchSource:0}: Error finding container 6be7ebc3cf307e66bcb1717111a8bfffd5e81ff9d9d01b5d296fabbb92d7f7cf: Status 404 returned error can't find the container with id 6be7ebc3cf307e66bcb1717111a8bfffd5e81ff9d9d01b5d296fabbb92d7f7cf Oct 02 20:31:48 crc kubenswrapper[4909]: I1002 20:31:48.431949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" event={"ID":"1ee38b21-8bfa-498c-bc5e-b4011e803c80","Type":"ContainerStarted","Data":"7d11fc1a006b7f7e5be5ea5b05e2eae101094ad94004594fb70dd61424502683"} Oct 02 20:31:48 crc kubenswrapper[4909]: I1002 20:31:48.432230 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" event={"ID":"1ee38b21-8bfa-498c-bc5e-b4011e803c80","Type":"ContainerStarted","Data":"6be7ebc3cf307e66bcb1717111a8bfffd5e81ff9d9d01b5d296fabbb92d7f7cf"} Oct 
02 20:31:48 crc kubenswrapper[4909]: I1002 20:31:48.455431 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" podStartSLOduration=1.4554137520000001 podStartE2EDuration="1.455413752s" podCreationTimestamp="2025-10-02 20:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 20:31:48.445749783 +0000 UTC m=+8029.633245642" watchObservedRunningTime="2025-10-02 20:31:48.455413752 +0000 UTC m=+8029.642909611" Oct 02 20:31:49 crc kubenswrapper[4909]: I1002 20:31:49.615818 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:31:49 crc kubenswrapper[4909]: E1002 20:31:49.616712 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:32:02 crc kubenswrapper[4909]: I1002 20:32:02.612380 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:32:02 crc kubenswrapper[4909]: E1002 20:32:02.613308 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:32:16 crc kubenswrapper[4909]: I1002 20:32:16.611252 4909 
scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:32:16 crc kubenswrapper[4909]: E1002 20:32:16.613089 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:32:27 crc kubenswrapper[4909]: I1002 20:32:27.611692 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:32:27 crc kubenswrapper[4909]: E1002 20:32:27.612414 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:32:30 crc kubenswrapper[4909]: I1002 20:32:30.961499 4909 scope.go:117] "RemoveContainer" containerID="a6ee0d5bd5b19cd2fe7f886ac5d8136ce9d8df80f86a075ad1d41797aee1864f" Oct 02 20:32:42 crc kubenswrapper[4909]: I1002 20:32:42.609268 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:32:42 crc kubenswrapper[4909]: E1002 20:32:42.610195 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:32:55 crc kubenswrapper[4909]: I1002 20:32:55.612980 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:32:55 crc kubenswrapper[4909]: E1002 20:32:55.667149 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:33:10 crc kubenswrapper[4909]: I1002 20:33:10.609209 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:33:10 crc kubenswrapper[4909]: E1002 20:33:10.610407 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.030202 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-api/0.log" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.252646 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-listener/0.log" Oct 02 20:33:20 crc 
kubenswrapper[4909]: I1002 20:33:20.269937 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-evaluator/0.log" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.445387 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_325c0e41-6d5b-4920-8d81-5a161eb3189a/aodh-notifier/0.log" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.677250 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b6849dc4b-wmr8p_b4918d43-b740-469f-9ee9-3011c688b622/barbican-api/0.log" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.709346 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b6849dc4b-wmr8p_b4918d43-b740-469f-9ee9-3011c688b622/barbican-api-log/0.log" Oct 02 20:33:20 crc kubenswrapper[4909]: I1002 20:33:20.891954 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75889c558-g9b2x_efde1707-49d8-4674-9cf8-3cd1de63b5c9/barbican-keystone-listener/0.log" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.208479 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75889c558-g9b2x_efde1707-49d8-4674-9cf8-3cd1de63b5c9/barbican-keystone-listener-log/0.log" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.386783 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77d5f8945f-mrwjk_0709099c-cc29-4dbb-9874-ab921318a936/barbican-worker-log/0.log" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.399977 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77d5f8945f-mrwjk_0709099c-cc29-4dbb-9874-ab921318a936/barbican-worker/0.log" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.608780 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 
20:33:21 crc kubenswrapper[4909]: E1002 20:33:21.609128 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.661677 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zfrg2_50be6a2c-c436-47ba-a5c7-d2cb51151c09/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:21 crc kubenswrapper[4909]: I1002 20:33:21.970510 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/ceilometer-central-agent/0.log" Oct 02 20:33:22 crc kubenswrapper[4909]: I1002 20:33:22.363062 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/proxy-httpd/0.log" Oct 02 20:33:22 crc kubenswrapper[4909]: I1002 20:33:22.372520 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/ceilometer-notification-agent/0.log" Oct 02 20:33:22 crc kubenswrapper[4909]: I1002 20:33:22.452933 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5e3dec40-d45b-44d5-858e-72e56c62dfed/sg-core/0.log" Oct 02 20:33:22 crc kubenswrapper[4909]: I1002 20:33:22.650525 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-2vk76_9dae9f65-075e-408c-851a-61e7b36a99f7/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:22 crc kubenswrapper[4909]: I1002 20:33:22.859339 4909 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gvqtg_e78721da-178f-402b-a402-3709a89b744e/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.068426 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_237e5a0a-a767-48de-aa6b-bb8cd24fd570/cinder-api-log/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.082162 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_237e5a0a-a767-48de-aa6b-bb8cd24fd570/cinder-api/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.328753 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6d132d6b-b137-48eb-8ccd-39354ec83056/probe/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.542937 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6d132d6b-b137-48eb-8ccd-39354ec83056/cinder-backup/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.590337 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d04288ee-974d-451f-8144-e981667f5115/cinder-scheduler/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.794115 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d04288ee-974d-451f-8144-e981667f5115/probe/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.881446 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2d392c07-e417-4d0d-b301-adf410105519/cinder-volume/0.log" Oct 02 20:33:23 crc kubenswrapper[4909]: I1002 20:33:23.983257 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2d392c07-e417-4d0d-b301-adf410105519/probe/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.061381 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9bw67_1ee1ee0b-fca9-4cc7-8016-e8a623cb5955/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.248231 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-q4gcf_7d34a713-2f9f-4d3d-9341-eb2419b2057d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.419538 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/init/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.562409 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/init/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.633070 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cwz7n_ac29833f-f7cb-4d06-96f8-3f73e527b175/dnsmasq-dns/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.796546 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d359125d-3d0d-4c5d-bacd-c722f9fe3116/glance-log/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.811285 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d359125d-3d0d-4c5d-bacd-c722f9fe3116/glance-httpd/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.978939 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d71e63f4-8904-4721-8c48-b66216330fc2/glance-httpd/0.log" Oct 02 20:33:24 crc kubenswrapper[4909]: I1002 20:33:24.994508 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_d71e63f4-8904-4721-8c48-b66216330fc2/glance-log/0.log" Oct 02 20:33:25 crc kubenswrapper[4909]: I1002 20:33:25.777954 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6fd97cfc64-gbtcw_b0981006-350f-4a53-85d4-35a0bb5c3eca/heat-engine/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.150272 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-85bddbb496-tlzmz_8b922021-6783-4a96-8ee1-a571074d1f49/heat-api/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.263646 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6cc7795987-nsplf_9220d7fd-14fd-44e5-ba47-2b4038b7472f/heat-cfnapi/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.478114 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c7d49444-5plj4_a982efe8-dc2e-4706-bfd3-0b14dbd266cf/horizon/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.722483 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c7rxp_db8ea573-a0c2-4c11-9f5b-4122524f6384/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.891245 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c7d49444-5plj4_a982efe8-dc2e-4706-bfd3-0b14dbd266cf/horizon-log/0.log" Oct 02 20:33:26 crc kubenswrapper[4909]: I1002 20:33:26.935012 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c2klm_a2ba2b75-7e2f-43f7-a72e-f5cdad40cc55/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:27 crc kubenswrapper[4909]: I1002 20:33:27.284102 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323861-qs6x8_8c8428d5-6fb6-4e18-84b2-d5f88dd993d7/keystone-cron/0.log" 
Oct 02 20:33:27 crc kubenswrapper[4909]: I1002 20:33:27.491520 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323921-smc4d_d5fd446f-b5eb-4efa-ae6f-877c30b6321d/keystone-cron/0.log" Oct 02 20:33:27 crc kubenswrapper[4909]: I1002 20:33:27.632447 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8486f6d788-jp6q4_eeeaf88e-e118-47c2-84b3-088443528f41/keystone-api/0.log" Oct 02 20:33:27 crc kubenswrapper[4909]: I1002 20:33:27.743831 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ccc2628c-e957-4344-b396-d3fe3ddd0da1/kube-state-metrics/0.log" Oct 02 20:33:27 crc kubenswrapper[4909]: I1002 20:33:27.948376 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fqqhz_e917646f-ee0c-442e-8a71-d637ef36a45e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.088601 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-5cjpl_524a4d41-71b9-4a01-8b0a-37c2748a79a2/logging-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.353133 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ba1680d2-9b8d-4c73-bf55-58ad455baa81/manila-api-log/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.387273 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ba1680d2-9b8d-4c73-bf55-58ad455baa81/manila-api/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.585281 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1364e9a8-caf2-48b7-bf59-5405d8c7ce96/probe/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.678840 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_1364e9a8-caf2-48b7-bf59-5405d8c7ce96/manila-scheduler/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.832630 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_16f4bb3d-8601-40aa-bef4-026dc559b7a9/manila-share/0.log" Oct 02 20:33:28 crc kubenswrapper[4909]: I1002 20:33:28.884976 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_16f4bb3d-8601-40aa-bef4-026dc559b7a9/probe/0.log" Oct 02 20:33:29 crc kubenswrapper[4909]: I1002 20:33:29.160541 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_ed14f597-80d1-41ec-a205-85e2e85173c2/mysqld-exporter/0.log" Oct 02 20:33:29 crc kubenswrapper[4909]: I1002 20:33:29.640141 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-849fddf4f5-f44kx_84148aec-ab89-4621-8448-b1bbbf294ad5/neutron-api/0.log" Oct 02 20:33:29 crc kubenswrapper[4909]: I1002 20:33:29.736779 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-849fddf4f5-f44kx_84148aec-ab89-4621-8448-b1bbbf294ad5/neutron-httpd/0.log" Oct 02 20:33:30 crc kubenswrapper[4909]: I1002 20:33:30.159963 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dsfpq_c5ea6385-4b7d-43d2-9666-f87c8aeab5e7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:31 crc kubenswrapper[4909]: I1002 20:33:31.178789 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_080b267d-f77b-4b0e-ab73-326a3c9b67b9/nova-api-log/0.log" Oct 02 20:33:31 crc kubenswrapper[4909]: I1002 20:33:31.862985 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_080b267d-f77b-4b0e-ab73-326a3c9b67b9/nova-api-api/0.log" Oct 02 20:33:31 crc kubenswrapper[4909]: I1002 20:33:31.911463 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_3a482be6-ee1c-4b4b-a1fe-e05813afe8c1/nova-cell0-conductor-conductor/0.log" Oct 02 20:33:32 crc kubenswrapper[4909]: I1002 20:33:32.299095 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4e10ad5d-5b1e-47a3-a188-50702d492942/nova-cell1-conductor-conductor/0.log" Oct 02 20:33:32 crc kubenswrapper[4909]: I1002 20:33:32.725878 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fe17b6af-3b5c-45b6-b6d3-140b3873c81d/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 20:33:32 crc kubenswrapper[4909]: I1002 20:33:32.795519 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-vrq2t_d535ce44-b440-434f-8526-3dd777d90ae8/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:33 crc kubenswrapper[4909]: I1002 20:33:33.016708 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_55be370d-35f9-4114-9fd5-48a0b939125a/nova-metadata-log/0.log" Oct 02 20:33:33 crc kubenswrapper[4909]: I1002 20:33:33.468967 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_928ab855-d71e-48e2-bbae-e872154de8bf/nova-scheduler-scheduler/0.log" Oct 02 20:33:33 crc kubenswrapper[4909]: I1002 20:33:33.698825 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/mysql-bootstrap/0.log" Oct 02 20:33:33 crc kubenswrapper[4909]: I1002 20:33:33.883292 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/galera/0.log" Oct 02 20:33:33 crc kubenswrapper[4909]: I1002 20:33:33.892660 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fa4c480-f836-44f0-b313-ac6cf9e97262/mysql-bootstrap/0.log" Oct 02 
20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.096045 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/mysql-bootstrap/0.log" Oct 02 20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.374532 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/mysql-bootstrap/0.log" Oct 02 20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.411450 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f308ff37-5d0f-4b6f-8b67-1ab86795e820/galera/0.log" Oct 02 20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.608414 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:33:34 crc kubenswrapper[4909]: E1002 20:33:34.608760 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.616435 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a1aff9fe-ad1a-4056-a64c-8b83abf09d32/openstackclient/0.log" Oct 02 20:33:34 crc kubenswrapper[4909]: I1002 20:33:34.885842 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bzwss_b3890e7f-68df-442e-be6c-d6c69309c66d/openstack-network-exporter/0.log" Oct 02 20:33:35 crc kubenswrapper[4909]: I1002 20:33:35.120700 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server-init/0.log" Oct 02 20:33:35 crc 
kubenswrapper[4909]: I1002 20:33:35.299712 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server-init/0.log" Oct 02 20:33:35 crc kubenswrapper[4909]: I1002 20:33:35.311686 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovs-vswitchd/0.log" Oct 02 20:33:35 crc kubenswrapper[4909]: I1002 20:33:35.494756 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wz8jk_01e45a1f-f70c-4e2f-94ed-763af4c5b5cb/ovsdb-server/0.log" Oct 02 20:33:35 crc kubenswrapper[4909]: I1002 20:33:35.732795 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sj5pf_9f1ef01b-ae9b-4a58-a411-d2e2e5770742/ovn-controller/0.log" Oct 02 20:33:35 crc kubenswrapper[4909]: I1002 20:33:35.990317 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-s99c9_5aaf1b55-e573-4c4a-b68a-7d9477b5393d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.177110 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_55be370d-35f9-4114-9fd5-48a0b939125a/nova-metadata-metadata/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.183855 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5db23d92-24c2-4ee0-a489-adbb1c9cc04e/openstack-network-exporter/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.233451 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5db23d92-24c2-4ee0-a489-adbb1c9cc04e/ovn-northd/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.336390 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e32a55a-2e45-4365-a8cf-3002a0c0ba73/openstack-network-exporter/0.log" 
Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.430919 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e32a55a-2e45-4365-a8cf-3002a0c0ba73/ovsdbserver-nb/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.643781 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17ca7d1c-52c8-480e-975a-d22877f0971f/openstack-network-exporter/0.log" Oct 02 20:33:36 crc kubenswrapper[4909]: I1002 20:33:36.678188 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17ca7d1c-52c8-480e-975a-d22877f0971f/ovsdbserver-sb/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.036411 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57bc775dcb-fd69b_af49e65e-4d40-4b78-8219-aa5d209825d0/placement-api/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.064724 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57bc775dcb-fd69b_af49e65e-4d40-4b78-8219-aa5d209825d0/placement-log/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.220622 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/init-config-reloader/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.412460 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/init-config-reloader/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.457250 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/config-reloader/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.458079 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/prometheus/0.log" 
Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.590544 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bcfde8f0-f1c0-41c4-bc7f-8c6c9f7e28a7/thanos-sidecar/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.663575 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/setup-container/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.885596 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/setup-container/0.log" Oct 02 20:33:37 crc kubenswrapper[4909]: I1002 20:33:37.915482 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5f627398-76ee-40f8-9c82-47cc58ecb013/rabbitmq/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.143556 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/setup-container/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.320813 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/setup-container/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.362051 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fc2c3682-578f-4f96-a535-d35eb31303c6/rabbitmq/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.486588 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-c97k6_dc36b692-f1b5-4320-9dd0-dbc68b7b5db0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.682884 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-twmcm_c1372353-9e7d-4e84-b8b9-44db5e82f9d1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.855009 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9m7r7_3b67f438-661f-476b-867c-8d1f4e742634/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:38 crc kubenswrapper[4909]: I1002 20:33:38.959694 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qxxtj_50a84617-a1af-4b99-a42d-d72651c450dd/ssh-known-hosts-edpm-deployment/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.211608 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc88f55df-4dlwb_811ebdce-cef0-4178-836b-17bcdd164575/proxy-server/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.391154 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc88f55df-4dlwb_811ebdce-cef0-4178-836b-17bcdd164575/proxy-httpd/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.397059 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4kkkt_0e9cab60-e149-4118-a3ca-4423620830bf/swift-ring-rebalance/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.680558 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-reaper/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.755959 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-auditor/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.885932 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-replicator/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.960523 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-auditor/0.log" Oct 02 20:33:39 crc kubenswrapper[4909]: I1002 20:33:39.962695 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/account-server/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.155269 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-replicator/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.232690 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-updater/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.276017 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/container-server/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.402594 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-auditor/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.476135 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-expirer/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.508524 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-replicator/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.628017 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-server/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.744043 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/object-updater/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.764495 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/rsync/0.log" Oct 02 20:33:40 crc kubenswrapper[4909]: I1002 20:33:40.911618 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc25944b-75d4-4e4e-b1de-57794b5c4bcf/swift-recon-cron/0.log" Oct 02 20:33:41 crc kubenswrapper[4909]: I1002 20:33:41.065613 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4k427_fc3aeeca-599a-4f61-92d4-9a09ad65206f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:41 crc kubenswrapper[4909]: I1002 20:33:41.266530 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wfx24_e14b9278-9f5b-42fe-b5c2-5cbd6bf165d7/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:41 crc kubenswrapper[4909]: I1002 20:33:41.570846 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_69cde6e6-c884-4a30-a68b-a584dc3c370d/test-operator-logs-container/0.log" Oct 02 20:33:41 crc kubenswrapper[4909]: I1002 20:33:41.780480 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ntq5n_9e5b6fd5-fd08-4243-b59c-ec7e59a6fd20/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 20:33:42 crc kubenswrapper[4909]: I1002 20:33:42.179876 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_5c3787cb-4303-4e47-aa85-ba12c768c729/tempest-tests-tempest-tests-runner/0.log" Oct 02 20:33:46 crc kubenswrapper[4909]: I1002 20:33:46.609999 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:33:46 crc kubenswrapper[4909]: E1002 20:33:46.610439 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:33:50 crc kubenswrapper[4909]: I1002 20:33:50.436372 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f77fb311-1680-4c6f-ac0c-70baa3e89b81/memcached/0.log" Oct 02 20:34:00 crc kubenswrapper[4909]: I1002 20:34:00.608550 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:34:01 crc kubenswrapper[4909]: I1002 20:34:01.789820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d"} Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.500767 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.503941 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.530552 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.581431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.581487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc7x2\" (UniqueName: \"kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.581665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.683700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.683745 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cc7x2\" (UniqueName: \"kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.683923 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.684890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.684888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.708646 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc7x2\" (UniqueName: \"kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2\") pod \"redhat-operators-79db2\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:09 crc kubenswrapper[4909]: I1002 20:34:09.823339 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:10 crc kubenswrapper[4909]: I1002 20:34:10.947556 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:11 crc kubenswrapper[4909]: I1002 20:34:11.897262 4909 generic.go:334] "Generic (PLEG): container finished" podID="48693696-85b8-4dab-ab71-9794add683d7" containerID="7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab" exitCode=0 Oct 02 20:34:11 crc kubenswrapper[4909]: I1002 20:34:11.897324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerDied","Data":"7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab"} Oct 02 20:34:11 crc kubenswrapper[4909]: I1002 20:34:11.897647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerStarted","Data":"bf3b4c76b9643cf24ce2ce31e1480c55a91070381e040252da4f1230baaca0f8"} Oct 02 20:34:11 crc kubenswrapper[4909]: I1002 20:34:11.902056 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:34:13 crc kubenswrapper[4909]: I1002 20:34:13.917386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerStarted","Data":"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090"} Oct 02 20:34:16 crc kubenswrapper[4909]: I1002 20:34:16.964635 4909 generic.go:334] "Generic (PLEG): container finished" podID="48693696-85b8-4dab-ab71-9794add683d7" containerID="024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090" exitCode=0 Oct 02 20:34:16 crc kubenswrapper[4909]: I1002 20:34:16.964694 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerDied","Data":"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090"} Oct 02 20:34:18 crc kubenswrapper[4909]: I1002 20:34:18.987393 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ee38b21-8bfa-498c-bc5e-b4011e803c80" containerID="7d11fc1a006b7f7e5be5ea5b05e2eae101094ad94004594fb70dd61424502683" exitCode=0 Oct 02 20:34:18 crc kubenswrapper[4909]: I1002 20:34:18.987548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" event={"ID":"1ee38b21-8bfa-498c-bc5e-b4011e803c80","Type":"ContainerDied","Data":"7d11fc1a006b7f7e5be5ea5b05e2eae101094ad94004594fb70dd61424502683"} Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.008900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerStarted","Data":"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83"} Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.031248 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-79db2" podStartSLOduration=4.189235012 podStartE2EDuration="11.031223555s" podCreationTimestamp="2025-10-02 20:34:09 +0000 UTC" firstStartedPulling="2025-10-02 20:34:11.900111591 +0000 UTC m=+8173.087607460" lastFinishedPulling="2025-10-02 20:34:18.742100134 +0000 UTC m=+8179.929596003" observedRunningTime="2025-10-02 20:34:20.026274001 +0000 UTC m=+8181.213769860" watchObservedRunningTime="2025-10-02 20:34:20.031223555 +0000 UTC m=+8181.218719414" Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.135558 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.174466 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-kxtgm"] Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.237063 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-kxtgm"] Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.247240 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host\") pod \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.247533 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host" (OuterVolumeSpecName: "host") pod "1ee38b21-8bfa-498c-bc5e-b4011e803c80" (UID: "1ee38b21-8bfa-498c-bc5e-b4011e803c80"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.247566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw8p2\" (UniqueName: \"kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2\") pod \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\" (UID: \"1ee38b21-8bfa-498c-bc5e-b4011e803c80\") " Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.248843 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee38b21-8bfa-498c-bc5e-b4011e803c80-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.274903 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2" (OuterVolumeSpecName: "kube-api-access-pw8p2") pod "1ee38b21-8bfa-498c-bc5e-b4011e803c80" (UID: "1ee38b21-8bfa-498c-bc5e-b4011e803c80"). InnerVolumeSpecName "kube-api-access-pw8p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:34:20 crc kubenswrapper[4909]: I1002 20:34:20.351530 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw8p2\" (UniqueName: \"kubernetes.io/projected/1ee38b21-8bfa-498c-bc5e-b4011e803c80-kube-api-access-pw8p2\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.020544 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-kxtgm" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.020461 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be7ebc3cf307e66bcb1717111a8bfffd5e81ff9d9d01b5d296fabbb92d7f7cf" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.435172 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-pcdm7"] Oct 02 20:34:21 crc kubenswrapper[4909]: E1002 20:34:21.435904 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee38b21-8bfa-498c-bc5e-b4011e803c80" containerName="container-00" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.435922 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee38b21-8bfa-498c-bc5e-b4011e803c80" containerName="container-00" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.436168 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee38b21-8bfa-498c-bc5e-b4011e803c80" containerName="container-00" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.436882 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.442908 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7xbqd"/"default-dockercfg-c26pb" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.578975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host\") pod \"crc-debug-pcdm7\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.579258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm6df\" (UniqueName: \"kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df\") pod \"crc-debug-pcdm7\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.621185 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee38b21-8bfa-498c-bc5e-b4011e803c80" path="/var/lib/kubelet/pods/1ee38b21-8bfa-498c-bc5e-b4011e803c80/volumes" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.681936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host\") pod \"crc-debug-pcdm7\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.682107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm6df\" (UniqueName: \"kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df\") pod \"crc-debug-pcdm7\" (UID: 
\"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.684289 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host\") pod \"crc-debug-pcdm7\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.706633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm6df\" (UniqueName: \"kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df\") pod \"crc-debug-pcdm7\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: I1002 20:34:21.756664 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:21 crc kubenswrapper[4909]: W1002 20:34:21.804370 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod523d0e30_a443_4c7f_891d_2a76d525a2ff.slice/crio-de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c WatchSource:0}: Error finding container de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c: Status 404 returned error can't find the container with id de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c Oct 02 20:34:22 crc kubenswrapper[4909]: I1002 20:34:22.033476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" event={"ID":"523d0e30-a443-4c7f-891d-2a76d525a2ff","Type":"ContainerStarted","Data":"de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c"} Oct 02 20:34:23 crc kubenswrapper[4909]: I1002 20:34:23.043071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" event={"ID":"523d0e30-a443-4c7f-891d-2a76d525a2ff","Type":"ContainerStarted","Data":"078ae825d31084f0fd5758fa7c65edce3e056e7ecadae749d408dedbd8dcf582"} Oct 02 20:34:23 crc kubenswrapper[4909]: I1002 20:34:23.067230 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" podStartSLOduration=2.067212902 podStartE2EDuration="2.067212902s" podCreationTimestamp="2025-10-02 20:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 20:34:23.064946612 +0000 UTC m=+8184.252442471" watchObservedRunningTime="2025-10-02 20:34:23.067212902 +0000 UTC m=+8184.254708761" Oct 02 20:34:25 crc kubenswrapper[4909]: I1002 20:34:25.063257 4909 generic.go:334] "Generic (PLEG): container finished" podID="523d0e30-a443-4c7f-891d-2a76d525a2ff" containerID="078ae825d31084f0fd5758fa7c65edce3e056e7ecadae749d408dedbd8dcf582" exitCode=0 Oct 02 20:34:25 crc kubenswrapper[4909]: I1002 20:34:25.063343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" event={"ID":"523d0e30-a443-4c7f-891d-2a76d525a2ff","Type":"ContainerDied","Data":"078ae825d31084f0fd5758fa7c65edce3e056e7ecadae749d408dedbd8dcf582"} Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.197514 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.280333 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm6df\" (UniqueName: \"kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df\") pod \"523d0e30-a443-4c7f-891d-2a76d525a2ff\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.280649 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host\") pod \"523d0e30-a443-4c7f-891d-2a76d525a2ff\" (UID: \"523d0e30-a443-4c7f-891d-2a76d525a2ff\") " Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.281188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host" (OuterVolumeSpecName: "host") pod "523d0e30-a443-4c7f-891d-2a76d525a2ff" (UID: "523d0e30-a443-4c7f-891d-2a76d525a2ff"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.289245 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df" (OuterVolumeSpecName: "kube-api-access-lm6df") pod "523d0e30-a443-4c7f-891d-2a76d525a2ff" (UID: "523d0e30-a443-4c7f-891d-2a76d525a2ff"). InnerVolumeSpecName "kube-api-access-lm6df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.383161 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/523d0e30-a443-4c7f-891d-2a76d525a2ff-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:26 crc kubenswrapper[4909]: I1002 20:34:26.383199 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm6df\" (UniqueName: \"kubernetes.io/projected/523d0e30-a443-4c7f-891d-2a76d525a2ff-kube-api-access-lm6df\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:27 crc kubenswrapper[4909]: I1002 20:34:27.084213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" event={"ID":"523d0e30-a443-4c7f-891d-2a76d525a2ff","Type":"ContainerDied","Data":"de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c"} Oct 02 20:34:27 crc kubenswrapper[4909]: I1002 20:34:27.084449 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de504e63612c4aadc9fd68dbcfc5e71b9410913864117da7a618fd6919c13d9c" Oct 02 20:34:27 crc kubenswrapper[4909]: I1002 20:34:27.084277 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-pcdm7" Oct 02 20:34:29 crc kubenswrapper[4909]: I1002 20:34:29.823518 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:29 crc kubenswrapper[4909]: I1002 20:34:29.823573 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:30 crc kubenswrapper[4909]: I1002 20:34:30.882770 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-79db2" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" probeResult="failure" output=< Oct 02 20:34:30 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:34:30 crc kubenswrapper[4909]: > Oct 02 20:34:33 crc kubenswrapper[4909]: I1002 20:34:33.602406 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-pcdm7"] Oct 02 20:34:33 crc kubenswrapper[4909]: I1002 20:34:33.623072 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-pcdm7"] Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.801606 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-hnnh9"] Oct 02 20:34:34 crc kubenswrapper[4909]: E1002 20:34:34.802326 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523d0e30-a443-4c7f-891d-2a76d525a2ff" containerName="container-00" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.802339 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="523d0e30-a443-4c7f-891d-2a76d525a2ff" containerName="container-00" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.802737 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="523d0e30-a443-4c7f-891d-2a76d525a2ff" containerName="container-00" Oct 02 20:34:34 crc kubenswrapper[4909]: 
I1002 20:34:34.803625 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.806879 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7xbqd"/"default-dockercfg-c26pb" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.858494 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqc6\" (UniqueName: \"kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.858580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.961417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqc6\" (UniqueName: \"kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.961480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.961633 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:34 crc kubenswrapper[4909]: I1002 20:34:34.989933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqc6\" (UniqueName: \"kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6\") pod \"crc-debug-hnnh9\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:35 crc kubenswrapper[4909]: I1002 20:34:35.121651 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:35 crc kubenswrapper[4909]: W1002 20:34:35.163927 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod682139e2_ad94_40b3_beef_8fbfb2e9571f.slice/crio-4067be2065b8678fdd2d9d0793810c622da8310361d4a8f2e651614612e13c21 WatchSource:0}: Error finding container 4067be2065b8678fdd2d9d0793810c622da8310361d4a8f2e651614612e13c21: Status 404 returned error can't find the container with id 4067be2065b8678fdd2d9d0793810c622da8310361d4a8f2e651614612e13c21 Oct 02 20:34:35 crc kubenswrapper[4909]: I1002 20:34:35.184539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" event={"ID":"682139e2-ad94-40b3-beef-8fbfb2e9571f","Type":"ContainerStarted","Data":"4067be2065b8678fdd2d9d0793810c622da8310361d4a8f2e651614612e13c21"} Oct 02 20:34:35 crc kubenswrapper[4909]: I1002 20:34:35.626856 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523d0e30-a443-4c7f-891d-2a76d525a2ff" path="/var/lib/kubelet/pods/523d0e30-a443-4c7f-891d-2a76d525a2ff/volumes" Oct 02 20:34:36 crc kubenswrapper[4909]: I1002 20:34:36.194853 4909 
generic.go:334] "Generic (PLEG): container finished" podID="682139e2-ad94-40b3-beef-8fbfb2e9571f" containerID="f98bc9bc2c9683ef166e3a87b3b7418fa3cb7bcdfe522db5b4c697a3528a06eb" exitCode=0 Oct 02 20:34:36 crc kubenswrapper[4909]: I1002 20:34:36.194899 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" event={"ID":"682139e2-ad94-40b3-beef-8fbfb2e9571f","Type":"ContainerDied","Data":"f98bc9bc2c9683ef166e3a87b3b7418fa3cb7bcdfe522db5b4c697a3528a06eb"} Oct 02 20:34:36 crc kubenswrapper[4909]: I1002 20:34:36.239472 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-hnnh9"] Oct 02 20:34:36 crc kubenswrapper[4909]: I1002 20:34:36.252052 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xbqd/crc-debug-hnnh9"] Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.316424 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.416701 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqc6\" (UniqueName: \"kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6\") pod \"682139e2-ad94-40b3-beef-8fbfb2e9571f\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.416947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host\") pod \"682139e2-ad94-40b3-beef-8fbfb2e9571f\" (UID: \"682139e2-ad94-40b3-beef-8fbfb2e9571f\") " Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.417125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host" (OuterVolumeSpecName: "host") pod 
"682139e2-ad94-40b3-beef-8fbfb2e9571f" (UID: "682139e2-ad94-40b3-beef-8fbfb2e9571f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.417977 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/682139e2-ad94-40b3-beef-8fbfb2e9571f-host\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.422199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6" (OuterVolumeSpecName: "kube-api-access-8wqc6") pod "682139e2-ad94-40b3-beef-8fbfb2e9571f" (UID: "682139e2-ad94-40b3-beef-8fbfb2e9571f"). InnerVolumeSpecName "kube-api-access-8wqc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.520131 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqc6\" (UniqueName: \"kubernetes.io/projected/682139e2-ad94-40b3-beef-8fbfb2e9571f-kube-api-access-8wqc6\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:37 crc kubenswrapper[4909]: I1002 20:34:37.627717 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682139e2-ad94-40b3-beef-8fbfb2e9571f" path="/var/lib/kubelet/pods/682139e2-ad94-40b3-beef-8fbfb2e9571f/volumes" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.004525 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.172391 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.185689 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.200560 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.218256 4909 scope.go:117] "RemoveContainer" containerID="f98bc9bc2c9683ef166e3a87b3b7418fa3cb7bcdfe522db5b4c697a3528a06eb" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.218304 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/crc-debug-hnnh9" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.383914 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/extract/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.397569 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/util/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.463411 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_393b9ba3590b2be8fa56ff3698eeba17e90ce9494dba409a4ddd8437f5gh7pp_7adcf953-dc05-488e-a463-7a0574cd88bc/pull/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.567939 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-fz856_848033b4-f22a-4455-8699-45b7761bbee2/kube-rbac-proxy/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.684429 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-fz856_848033b4-f22a-4455-8699-45b7761bbee2/manager/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.716724 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-2lwdw_c108279d-762f-41ee-a725-55e9f58a8686/kube-rbac-proxy/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.828636 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-2lwdw_c108279d-762f-41ee-a725-55e9f58a8686/manager/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.923694 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9jlbp_6f98751d-f257-4445-a739-7ff447f1c3d8/kube-rbac-proxy/0.log" Oct 02 20:34:38 crc kubenswrapper[4909]: I1002 20:34:38.965294 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9jlbp_6f98751d-f257-4445-a739-7ff447f1c3d8/manager/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.175157 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-87dms_27b610f7-4219-4cb1-97b4-76a4627afc7a/kube-rbac-proxy/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.238021 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-87dms_27b610f7-4219-4cb1-97b4-76a4627afc7a/manager/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.283188 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-28j82_f87d6d15-86ff-46e3-8361-447ce9aff98c/kube-rbac-proxy/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 
20:34:39.494086 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-28j82_f87d6d15-86ff-46e3-8361-447ce9aff98c/manager/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.530739 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-hkdrb_ba5e23b3-da6b-41ec-9cac-11d4bc3a068b/manager/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.538137 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-hkdrb_ba5e23b3-da6b-41ec-9cac-11d4bc3a068b/kube-rbac-proxy/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.726926 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-bh4kt_d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1/kube-rbac-proxy/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.891012 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-886ls_30b85c17-5d08-452e-9099-20e55cc86f9e/kube-rbac-proxy/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.892123 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-bh4kt_d27e9c1d-e4ec-432f-a2f1-a27cf88ed6e1/manager/0.log" Oct 02 20:34:39 crc kubenswrapper[4909]: I1002 20:34:39.973234 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-886ls_30b85c17-5d08-452e-9099-20e55cc86f9e/manager/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.106406 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-9mjtf_6d4671d7-88da-4834-afb5-5deaf7e84cdb/kube-rbac-proxy/0.log" Oct 02 
20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.189156 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-9mjtf_6d4671d7-88da-4834-afb5-5deaf7e84cdb/manager/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.254446 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-hwgrp_22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff/kube-rbac-proxy/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.351343 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-hwgrp_22f2bd1c-63fc-4d4f-8d05-890fdb02a2ff/manager/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.447688 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-57hjm_c2bd7e2c-d50f-4808-85f5-4bae9b91d272/kube-rbac-proxy/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.467074 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-57hjm_c2bd7e2c-d50f-4808-85f5-4bae9b91d272/manager/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.606288 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-94lx8_5198c9b3-5c15-49de-9f3a-da04c80c4eb3/kube-rbac-proxy/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.711721 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-94lx8_5198c9b3-5c15-49de-9f3a-da04c80c4eb3/manager/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.793354 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-zxm4p_18e949b2-6404-43e9-954b-6a09780bf021/kube-rbac-proxy/0.log" Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.875261 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-79db2" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" probeResult="failure" output=< Oct 02 20:34:40 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Oct 02 20:34:40 crc kubenswrapper[4909]: > Oct 02 20:34:40 crc kubenswrapper[4909]: I1002 20:34:40.933803 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-zxm4p_18e949b2-6404-43e9-954b-6a09780bf021/manager/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.085818 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-g2lmn_d89c2056-bfda-4177-86e6-bc00964d5f22/kube-rbac-proxy/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.126677 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-g2lmn_d89c2056-bfda-4177-86e6-bc00964d5f22/manager/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.233970 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678btf42_b96ea387-f91f-47ef-ba02-4f61e8a750a3/kube-rbac-proxy/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.282954 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678btf42_b96ea387-f91f-47ef-ba02-4f61e8a750a3/manager/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.363006 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bffff79d9-6k5xz_92aaf752-fa7f-42da-98a1-298dc1f1f745/kube-rbac-proxy/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.761381 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c58d4ffff-87ksp_16c56fa3-a7be-4a4c-ba04-967d7a5f1fec/kube-rbac-proxy/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.974135 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6mfjw_b138a340-c0d7-436b-8308-14e2ea7a76a9/registry-server/0.log" Oct 02 20:34:41 crc kubenswrapper[4909]: I1002 20:34:41.978612 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c58d4ffff-87ksp_16c56fa3-a7be-4a4c-ba04-967d7a5f1fec/operator/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.282492 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-p9sbt_404b32bd-4a96-46d7-b8fa-bcd5fde074aa/kube-rbac-proxy/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.384840 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-d5vkl_9a100c33-7cb7-4af6-8262-c6255b252462/kube-rbac-proxy/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.405327 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-p9sbt_404b32bd-4a96-46d7-b8fa-bcd5fde074aa/manager/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.563088 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-d5vkl_9a100c33-7cb7-4af6-8262-c6255b252462/manager/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.713606 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8lmtf_2b364096-1561-45dd-9c7c-0e20f76360a6/operator/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.792312 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dmsmk_989d2b9b-e976-4e58-a506-5d5755154be0/kube-rbac-proxy/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.872998 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-dmsmk_989d2b9b-e976-4e58-a506-5d5755154be0/manager/0.log" Oct 02 20:34:42 crc kubenswrapper[4909]: I1002 20:34:42.993492 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-769bf6645d-mpcx5_8e79ecec-ac98-42dd-b513-071fbd0e235a/kube-rbac-proxy/0.log" Oct 02 20:34:43 crc kubenswrapper[4909]: I1002 20:34:43.153723 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bffff79d9-6k5xz_92aaf752-fa7f-42da-98a1-298dc1f1f745/manager/0.log" Oct 02 20:34:43 crc kubenswrapper[4909]: I1002 20:34:43.220401 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mfzz4_59585305-96e5-40d8-b9f0-37d76e01f40f/kube-rbac-proxy/0.log" Oct 02 20:34:43 crc kubenswrapper[4909]: I1002 20:34:43.331721 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mfzz4_59585305-96e5-40d8-b9f0-37d76e01f40f/manager/0.log" Oct 02 20:34:43 crc kubenswrapper[4909]: I1002 20:34:43.412104 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8pxwr_40106a12-6dd2-492f-94e0-b3ab9e866b81/kube-rbac-proxy/0.log" Oct 02 20:34:43 crc 
kubenswrapper[4909]: I1002 20:34:43.426963 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-8pxwr_40106a12-6dd2-492f-94e0-b3ab9e866b81/manager/0.log" Oct 02 20:34:43 crc kubenswrapper[4909]: I1002 20:34:43.463319 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-769bf6645d-mpcx5_8e79ecec-ac98-42dd-b513-071fbd0e235a/manager/0.log" Oct 02 20:34:49 crc kubenswrapper[4909]: I1002 20:34:49.920290 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:49 crc kubenswrapper[4909]: I1002 20:34:49.993075 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:50 crc kubenswrapper[4909]: I1002 20:34:50.168561 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.356550 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-79db2" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" containerID="cri-o://24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83" gracePeriod=2 Oct 02 20:34:51 crc kubenswrapper[4909]: E1002 20:34:51.575963 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48693696_85b8_4dab_ab71_9794add683d7.slice/crio-24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83.scope\": RecentStats: unable to find data in memory cache]" Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.920017 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.972214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities\") pod \"48693696-85b8-4dab-ab71-9794add683d7\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.972518 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc7x2\" (UniqueName: \"kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2\") pod \"48693696-85b8-4dab-ab71-9794add683d7\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.972601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content\") pod \"48693696-85b8-4dab-ab71-9794add683d7\" (UID: \"48693696-85b8-4dab-ab71-9794add683d7\") " Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.973184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities" (OuterVolumeSpecName: "utilities") pod "48693696-85b8-4dab-ab71-9794add683d7" (UID: "48693696-85b8-4dab-ab71-9794add683d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.973619 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:51 crc kubenswrapper[4909]: I1002 20:34:51.980466 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2" (OuterVolumeSpecName: "kube-api-access-cc7x2") pod "48693696-85b8-4dab-ab71-9794add683d7" (UID: "48693696-85b8-4dab-ab71-9794add683d7"). InnerVolumeSpecName "kube-api-access-cc7x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.053933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48693696-85b8-4dab-ab71-9794add683d7" (UID: "48693696-85b8-4dab-ab71-9794add683d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.076052 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc7x2\" (UniqueName: \"kubernetes.io/projected/48693696-85b8-4dab-ab71-9794add683d7-kube-api-access-cc7x2\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.076087 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48693696-85b8-4dab-ab71-9794add683d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.364496 4909 generic.go:334] "Generic (PLEG): container finished" podID="48693696-85b8-4dab-ab71-9794add683d7" containerID="24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83" exitCode=0 Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.364553 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-79db2" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.364568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerDied","Data":"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83"} Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.364655 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79db2" event={"ID":"48693696-85b8-4dab-ab71-9794add683d7","Type":"ContainerDied","Data":"bf3b4c76b9643cf24ce2ce31e1480c55a91070381e040252da4f1230baaca0f8"} Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.364692 4909 scope.go:117] "RemoveContainer" containerID="24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.391896 4909 scope.go:117] "RemoveContainer" 
containerID="024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.411503 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.422283 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-79db2"] Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.426043 4909 scope.go:117] "RemoveContainer" containerID="7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.473226 4909 scope.go:117] "RemoveContainer" containerID="24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83" Oct 02 20:34:52 crc kubenswrapper[4909]: E1002 20:34:52.474173 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83\": container with ID starting with 24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83 not found: ID does not exist" containerID="24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.474235 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83"} err="failed to get container status \"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83\": rpc error: code = NotFound desc = could not find container \"24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83\": container with ID starting with 24d253366870550def85cefb4166a9d93515aa5aa44f6f524ca391a9d9a5db83 not found: ID does not exist" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.474274 4909 scope.go:117] "RemoveContainer" 
containerID="024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090" Oct 02 20:34:52 crc kubenswrapper[4909]: E1002 20:34:52.474649 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090\": container with ID starting with 024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090 not found: ID does not exist" containerID="024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.474678 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090"} err="failed to get container status \"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090\": rpc error: code = NotFound desc = could not find container \"024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090\": container with ID starting with 024a4cdc3cff5d944353f7abf581790cdc903d0b7e5ba23718dd897f30414090 not found: ID does not exist" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.474697 4909 scope.go:117] "RemoveContainer" containerID="7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab" Oct 02 20:34:52 crc kubenswrapper[4909]: E1002 20:34:52.475421 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab\": container with ID starting with 7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab not found: ID does not exist" containerID="7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab" Oct 02 20:34:52 crc kubenswrapper[4909]: I1002 20:34:52.475450 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab"} err="failed to get container status \"7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab\": rpc error: code = NotFound desc = could not find container \"7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab\": container with ID starting with 7e1e2174835ce4cf63e903dcf9f790b94ae77e788b4ede951e5fbca3ad6879ab not found: ID does not exist" Oct 02 20:34:53 crc kubenswrapper[4909]: I1002 20:34:53.629224 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48693696-85b8-4dab-ab71-9794add683d7" path="/var/lib/kubelet/pods/48693696-85b8-4dab-ab71-9794add683d7/volumes" Oct 02 20:34:59 crc kubenswrapper[4909]: I1002 20:34:59.462186 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h4pvv_a7cc0f59-db36-4e76-93fe-4c99f3e621a0/control-plane-machine-set-operator/0.log" Oct 02 20:34:59 crc kubenswrapper[4909]: I1002 20:34:59.619950 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgbzj_4bcc6ed7-1c31-4831-8b45-2729aaa5f89c/kube-rbac-proxy/0.log" Oct 02 20:34:59 crc kubenswrapper[4909]: I1002 20:34:59.677484 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fgbzj_4bcc6ed7-1c31-4831-8b45-2729aaa5f89c/machine-api-operator/0.log" Oct 02 20:35:12 crc kubenswrapper[4909]: I1002 20:35:12.243848 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wscv9_69b56b01-8070-4514-8168-51f33b7a2d07/cert-manager-controller/0.log" Oct 02 20:35:12 crc kubenswrapper[4909]: I1002 20:35:12.290079 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lf4lx_48294402-b72e-4b64-a227-a4ccb355ef9f/cert-manager-cainjector/0.log" Oct 02 20:35:12 crc 
kubenswrapper[4909]: I1002 20:35:12.362135 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nqkz8_52d78023-98ec-431e-a697-0aab89fc7e8a/cert-manager-webhook/0.log" Oct 02 20:35:25 crc kubenswrapper[4909]: I1002 20:35:25.589989 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-s4vhz_a0702a66-247b-4591-95cf-2ee69ffb5473/nmstate-console-plugin/0.log" Oct 02 20:35:25 crc kubenswrapper[4909]: I1002 20:35:25.744676 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gdng9_416821c4-4252-4c41-9e8d-5bf7689aae61/nmstate-handler/0.log" Oct 02 20:35:25 crc kubenswrapper[4909]: I1002 20:35:25.799512 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gts4l_f283eb50-d6fb-464c-a075-2e79f4d56305/kube-rbac-proxy/0.log" Oct 02 20:35:25 crc kubenswrapper[4909]: I1002 20:35:25.838278 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gts4l_f283eb50-d6fb-464c-a075-2e79f4d56305/nmstate-metrics/0.log" Oct 02 20:35:25 crc kubenswrapper[4909]: I1002 20:35:25.960833 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-vlhrp_2c0fec14-d9a3-4b4d-96dd-a22938d3c736/nmstate-operator/0.log" Oct 02 20:35:26 crc kubenswrapper[4909]: I1002 20:35:26.046867 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-mfztc_e036fd92-1027-4ac0-bde4-1c69c7fa7d4c/nmstate-webhook/0.log" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.123178 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:30 crc kubenswrapper[4909]: E1002 20:35:30.124209 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682139e2-ad94-40b3-beef-8fbfb2e9571f" 
containerName="container-00" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124232 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="682139e2-ad94-40b3-beef-8fbfb2e9571f" containerName="container-00" Oct 02 20:35:30 crc kubenswrapper[4909]: E1002 20:35:30.124253 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124266 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" Oct 02 20:35:30 crc kubenswrapper[4909]: E1002 20:35:30.124321 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="extract-content" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124335 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="extract-content" Oct 02 20:35:30 crc kubenswrapper[4909]: E1002 20:35:30.124349 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="extract-utilities" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124361 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="extract-utilities" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124769 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="682139e2-ad94-40b3-beef-8fbfb2e9571f" containerName="container-00" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.124828 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="48693696-85b8-4dab-ab71-9794add683d7" containerName="registry-server" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.127645 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.139913 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.278316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.278499 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcp6\" (UniqueName: \"kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.278729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.381366 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.381781 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.381898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcp6\" (UniqueName: \"kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.382278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.382530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.409193 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcp6\" (UniqueName: \"kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6\") pod \"certified-operators-wcrq8\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:30 crc kubenswrapper[4909]: I1002 20:35:30.458377 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:31 crc kubenswrapper[4909]: I1002 20:35:30.999862 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:31 crc kubenswrapper[4909]: I1002 20:35:31.813552 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerID="72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f" exitCode=0 Oct 02 20:35:31 crc kubenswrapper[4909]: I1002 20:35:31.813735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerDied","Data":"72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f"} Oct 02 20:35:31 crc kubenswrapper[4909]: I1002 20:35:31.814284 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerStarted","Data":"9894db64eb4a0a88500bb4bcc0d7faa2918e2141526ca2fe3386fcb44ef27452"} Oct 02 20:35:32 crc kubenswrapper[4909]: I1002 20:35:32.827056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerStarted","Data":"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7"} Oct 02 20:35:33 crc kubenswrapper[4909]: I1002 20:35:33.839304 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerID="461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7" exitCode=0 Oct 02 20:35:33 crc kubenswrapper[4909]: I1002 20:35:33.839419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" 
event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerDied","Data":"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7"} Oct 02 20:35:34 crc kubenswrapper[4909]: I1002 20:35:34.853078 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerStarted","Data":"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153"} Oct 02 20:35:34 crc kubenswrapper[4909]: I1002 20:35:34.875246 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wcrq8" podStartSLOduration=2.149844372 podStartE2EDuration="4.875227968s" podCreationTimestamp="2025-10-02 20:35:30 +0000 UTC" firstStartedPulling="2025-10-02 20:35:31.816997632 +0000 UTC m=+8253.004493491" lastFinishedPulling="2025-10-02 20:35:34.542381228 +0000 UTC m=+8255.729877087" observedRunningTime="2025-10-02 20:35:34.873867176 +0000 UTC m=+8256.061363035" watchObservedRunningTime="2025-10-02 20:35:34.875227968 +0000 UTC m=+8256.062723827" Oct 02 20:35:38 crc kubenswrapper[4909]: I1002 20:35:38.647306 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/kube-rbac-proxy/0.log" Oct 02 20:35:38 crc kubenswrapper[4909]: I1002 20:35:38.750338 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/manager/0.log" Oct 02 20:35:40 crc kubenswrapper[4909]: I1002 20:35:40.459148 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:40 crc kubenswrapper[4909]: I1002 20:35:40.459567 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:40 crc kubenswrapper[4909]: I1002 20:35:40.513903 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:40 crc kubenswrapper[4909]: I1002 20:35:40.976009 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:41 crc kubenswrapper[4909]: I1002 20:35:41.038221 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:42 crc kubenswrapper[4909]: I1002 20:35:42.937875 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wcrq8" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="registry-server" containerID="cri-o://bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153" gracePeriod=2 Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.481081 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.515434 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqcp6\" (UniqueName: \"kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6\") pod \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.515679 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities\") pod \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.515797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content\") pod \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\" (UID: \"fd1d76a7-afa0-4bce-9294-59e2348c83ea\") " Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.523456 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities" (OuterVolumeSpecName: "utilities") pod "fd1d76a7-afa0-4bce-9294-59e2348c83ea" (UID: "fd1d76a7-afa0-4bce-9294-59e2348c83ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.541344 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6" (OuterVolumeSpecName: "kube-api-access-rqcp6") pod "fd1d76a7-afa0-4bce-9294-59e2348c83ea" (UID: "fd1d76a7-afa0-4bce-9294-59e2348c83ea"). InnerVolumeSpecName "kube-api-access-rqcp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.584914 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd1d76a7-afa0-4bce-9294-59e2348c83ea" (UID: "fd1d76a7-afa0-4bce-9294-59e2348c83ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.621458 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqcp6\" (UniqueName: \"kubernetes.io/projected/fd1d76a7-afa0-4bce-9294-59e2348c83ea-kube-api-access-rqcp6\") on node \"crc\" DevicePath \"\"" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.621490 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.621503 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd1d76a7-afa0-4bce-9294-59e2348c83ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.950146 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerID="bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153" exitCode=0 Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.950188 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerDied","Data":"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153"} Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.950216 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wcrq8" event={"ID":"fd1d76a7-afa0-4bce-9294-59e2348c83ea","Type":"ContainerDied","Data":"9894db64eb4a0a88500bb4bcc0d7faa2918e2141526ca2fe3386fcb44ef27452"} Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.950232 4909 scope.go:117] "RemoveContainer" containerID="bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.950232 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wcrq8" Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.980827 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.995636 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wcrq8"] Oct 02 20:35:43 crc kubenswrapper[4909]: I1002 20:35:43.996400 4909 scope.go:117] "RemoveContainer" containerID="461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.019275 4909 scope.go:117] "RemoveContainer" containerID="72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.076238 4909 scope.go:117] "RemoveContainer" containerID="bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153" Oct 02 20:35:44 crc kubenswrapper[4909]: E1002 20:35:44.076678 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153\": container with ID starting with bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153 not found: ID does not exist" containerID="bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 
20:35:44.076709 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153"} err="failed to get container status \"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153\": rpc error: code = NotFound desc = could not find container \"bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153\": container with ID starting with bb257f01575bad0d735459ccb0a648b7f15addb73c4f3cb59a392b84379bc153 not found: ID does not exist" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.076728 4909 scope.go:117] "RemoveContainer" containerID="461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7" Oct 02 20:35:44 crc kubenswrapper[4909]: E1002 20:35:44.076978 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7\": container with ID starting with 461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7 not found: ID does not exist" containerID="461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.076999 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7"} err="failed to get container status \"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7\": rpc error: code = NotFound desc = could not find container \"461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7\": container with ID starting with 461992c6c89f357338ee7df92abe74d5a3f84eaaa9de6a41d3025090b19e27e7 not found: ID does not exist" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.077013 4909 scope.go:117] "RemoveContainer" containerID="72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f" Oct 02 20:35:44 crc 
kubenswrapper[4909]: E1002 20:35:44.077420 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f\": container with ID starting with 72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f not found: ID does not exist" containerID="72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f" Oct 02 20:35:44 crc kubenswrapper[4909]: I1002 20:35:44.077443 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f"} err="failed to get container status \"72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f\": rpc error: code = NotFound desc = could not find container \"72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f\": container with ID starting with 72a90db327f07d46e6faa27d28a40687bc047e7b0ab5bda25e6f51b53244ed9f not found: ID does not exist" Oct 02 20:35:45 crc kubenswrapper[4909]: I1002 20:35:45.621725 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" path="/var/lib/kubelet/pods/fd1d76a7-afa0-4bce-9294-59e2348c83ea/volumes" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.036457 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-8958c8b87-fqrjj_63728faa-74d7-4f8c-baab-348f0a26da5b/cluster-logging-operator/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.207691 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_90528ba7-f037-4809-959c-26c1a511bb84/loki-compactor/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.213785 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-t77xg_3903e4b2-91fd-4a38-880e-543863535cf5/collector/0.log" Oct 02 
20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.365530 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-6f5f7fff97-897dd_68d54b28-bf98-4e97-a5f8-9cc1abb31a5d/loki-distributor/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.442233 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-5dfzs_1714d70d-b81b-4886-816f-da1588c7364a/gateway/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.483630 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-5dfzs_1714d70d-b81b-4886-816f-da1588c7364a/opa/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.634940 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-htpg9_42692035-de07-49cc-b2f6-2305a4ff6f31/gateway/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.652318 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5994fd858f-htpg9_42692035-de07-49cc-b2f6-2305a4ff6f31/opa/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.823735 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_f384831b-fa88-451d-9e4c-3181bd39ed42/loki-index-gateway/0.log" Oct 02 20:35:53 crc kubenswrapper[4909]: I1002 20:35:53.939087 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_3af72dc6-9572-44cd-b4b8-cab3a6857f08/loki-ingester/0.log" Oct 02 20:35:54 crc kubenswrapper[4909]: I1002 20:35:54.027886 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5d954896cf-x8snb_8203105d-afce-403b-8a70-d624672e2826/loki-querier/0.log" Oct 02 20:35:54 crc kubenswrapper[4909]: I1002 20:35:54.139491 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6fbbbc8b7d-pp4hn_1ddf8d24-c907-43af-bb68-ee9a2c28fd67/loki-query-frontend/0.log" Oct 02 20:36:09 crc kubenswrapper[4909]: I1002 20:36:09.627046 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-vwbdq_6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9/kube-rbac-proxy/0.log" Oct 02 20:36:09 crc kubenswrapper[4909]: I1002 20:36:09.798719 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-vwbdq_6dd998da-5dc6-4aac-9fe5-664a3d0cc7f9/controller/0.log" Oct 02 20:36:09 crc kubenswrapper[4909]: I1002 20:36:09.904037 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.027859 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.079851 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.081660 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.131510 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.308314 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.308972 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.309252 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.377903 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.576478 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-reloader/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.579439 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-metrics/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.591471 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/cp-frr-files/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.634644 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/controller/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.789895 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/frr-metrics/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.820843 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/kube-rbac-proxy/0.log" Oct 02 20:36:10 crc kubenswrapper[4909]: I1002 20:36:10.853265 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/kube-rbac-proxy-frr/0.log" Oct 02 20:36:11 crc kubenswrapper[4909]: I1002 20:36:11.034868 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/reloader/0.log" Oct 02 20:36:11 crc kubenswrapper[4909]: I1002 20:36:11.090015 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-9wd5p_94246755-9836-440f-a7d3-5189dd6d1b6f/frr-k8s-webhook-server/0.log" Oct 02 20:36:11 crc kubenswrapper[4909]: I1002 20:36:11.376368 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-568bfffc64-vr4t5_04606b24-1344-4bc4-a7b5-c5bd3282afec/manager/0.log" Oct 02 20:36:11 crc kubenswrapper[4909]: I1002 20:36:11.479712 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-f8cdfbb4-bh9bj_9ffcfe84-1e40-4d9f-b411-4f8707346e92/webhook-server/0.log" Oct 02 20:36:11 crc kubenswrapper[4909]: I1002 20:36:11.600435 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x67sq_7da13784-fb30-48c8-8a21-99e151b56645/kube-rbac-proxy/0.log" Oct 02 20:36:12 crc kubenswrapper[4909]: I1002 20:36:12.300213 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x67sq_7da13784-fb30-48c8-8a21-99e151b56645/speaker/0.log" Oct 02 20:36:12 crc kubenswrapper[4909]: I1002 20:36:12.897195 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lwgmx_843ddbf5-2368-4807-903e-41a960b8e78e/frr/0.log" Oct 02 20:36:23 crc kubenswrapper[4909]: I1002 20:36:23.054941 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:36:23 crc kubenswrapper[4909]: I1002 20:36:23.057199 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:36:26 crc kubenswrapper[4909]: I1002 20:36:26.621202 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:36:26 crc kubenswrapper[4909]: I1002 20:36:26.789471 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:36:26 crc kubenswrapper[4909]: I1002 20:36:26.814464 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:36:26 crc kubenswrapper[4909]: I1002 20:36:26.850390 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.016559 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.018012 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/pull/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.049307 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37dldcbb_34d4acd8-da3b-4b4d-801f-4f7ccc6cac01/extract/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.236838 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.372788 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.408297 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.463717 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.596682 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.630613 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/extract/0.log" Oct 
02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.647512 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hh4m8_c68c1569-5f7c-405d-8179-795ab29eb2b4/pull/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.791917 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.983362 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:36:27 crc kubenswrapper[4909]: I1002 20:36:27.992754 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.025653 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.205367 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/extract/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.227314 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/pull/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.252675 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d864vg_9dc24505-fbd1-404b-b8c8-9b14f01fc1a8/util/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.427185 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.619201 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.641377 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.667126 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.842792 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/util/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.865549 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/extract/0.log" Oct 02 20:36:28 crc kubenswrapper[4909]: I1002 20:36:28.896458 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa601rg4q9_1f391df0-3f52-4966-8661-04ecd7d41088/pull/0.log" Oct 02 
20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.064839 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.227841 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.252158 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.266339 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.395395 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-utilities/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.451779 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/extract-content/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.626589 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.819570 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.859770 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:36:29 crc kubenswrapper[4909]: I1002 20:36:29.889424 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.118872 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-utilities/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.124151 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/extract-content/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.337839 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.477089 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wflc9_7453f0a3-5410-4768-ac27-88ac0ad93046/registry-server/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.595332 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.597633 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.630367 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:36:30 crc kubenswrapper[4909]: I1002 20:36:30.798892 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6hbk5_ab21d0da-ea4e-46e7-a9b7-cd0f81b3dddd/registry-server/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.008195 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/extract/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.015906 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/util/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.038550 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835czv7jq_c49d92a4-6655-4c61-b381-b8d52b36399b/pull/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.091546 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zvd2b_e7e3b138-c134-41ec-a5d6-af2e97914045/marketplace-operator/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.183627 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.390947 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.394451 4909 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.407328 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.565124 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-utilities/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.641519 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.649837 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/extract-content/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.888547 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vdc2w_63f1acf8-a7a9-4a20-a6b7-fec8d828619d/registry-server/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.889750 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.902604 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log" Oct 02 20:36:31 crc kubenswrapper[4909]: I1002 20:36:31.916830 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log" Oct 02 20:36:32 crc kubenswrapper[4909]: I1002 20:36:32.075689 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-content/0.log" Oct 02 20:36:32 crc kubenswrapper[4909]: I1002 20:36:32.090391 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/extract-utilities/0.log" Oct 02 20:36:32 crc kubenswrapper[4909]: I1002 20:36:32.977695 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khlvb_af93c2fc-1286-4c32-b8ec-b582a58114b8/registry-server/0.log" Oct 02 20:36:44 crc kubenswrapper[4909]: I1002 20:36:44.998672 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-zhfgm_355b5784-3dc4-4a65-93d8-4bdddd018358/prometheus-operator/0.log" Oct 02 20:36:45 crc kubenswrapper[4909]: I1002 20:36:45.125908 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d9847b6d-gntmk_109ab413-2eeb-42e1-b39f-feddfe589bcc/prometheus-operator-admission-webhook/0.log" Oct 02 20:36:45 crc kubenswrapper[4909]: I1002 20:36:45.132375 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d9847b6d-s8hrb_38edcfed-aa44-406d-9028-8395eb3ebb06/prometheus-operator-admission-webhook/0.log" Oct 02 20:36:45 crc kubenswrapper[4909]: I1002 20:36:45.285976 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-mwrxx_e4b958c5-5914-4978-884c-e9f42430f52f/operator/0.log" Oct 02 20:36:45 crc kubenswrapper[4909]: I1002 20:36:45.303497 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-rtc7s_f89fa753-5d7e-4c80-8847-25eeaee0c3e3/observability-ui-dashboards/0.log" Oct 02 20:36:45 crc kubenswrapper[4909]: I1002 20:36:45.446379 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-9btfm_671892a9-6a46-445d-b470-6c16b55b8818/perses-operator/0.log" Oct 02 20:36:53 crc kubenswrapper[4909]: I1002 20:36:53.055065 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:36:53 crc kubenswrapper[4909]: I1002 20:36:53.055854 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:36:58 crc kubenswrapper[4909]: I1002 20:36:58.287798 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/kube-rbac-proxy/0.log" Oct 02 20:36:58 crc kubenswrapper[4909]: I1002 20:36:58.380749 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-576dc5b57d-njbh7_32aac12f-cd5f-4a59-8b82-051057ed0e70/manager/0.log" Oct 02 20:37:23 crc kubenswrapper[4909]: I1002 20:37:23.054874 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 02 20:37:23 crc kubenswrapper[4909]: I1002 20:37:23.055348 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:37:23 crc kubenswrapper[4909]: I1002 20:37:23.055399 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:37:23 crc kubenswrapper[4909]: I1002 20:37:23.056367 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:37:23 crc kubenswrapper[4909]: I1002 20:37:23.056424 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d" gracePeriod=600 Oct 02 20:37:24 crc kubenswrapper[4909]: I1002 20:37:24.097898 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d" exitCode=0 Oct 02 20:37:24 crc kubenswrapper[4909]: I1002 20:37:24.097987 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" 
event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d"} Oct 02 20:37:24 crc kubenswrapper[4909]: I1002 20:37:24.098236 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerStarted","Data":"8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"} Oct 02 20:37:24 crc kubenswrapper[4909]: I1002 20:37:24.098268 4909 scope.go:117] "RemoveContainer" containerID="c237d1888dbf0af33e1e4a6d5d45c7c973d9b86c04189c1edc363b62b9601c32" Oct 02 20:38:31 crc kubenswrapper[4909]: I1002 20:38:31.325785 4909 scope.go:117] "RemoveContainer" containerID="7d11fc1a006b7f7e5be5ea5b05e2eae101094ad94004594fb70dd61424502683" Oct 02 20:39:23 crc kubenswrapper[4909]: I1002 20:39:23.054855 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:39:23 crc kubenswrapper[4909]: I1002 20:39:23.056168 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:39:28 crc kubenswrapper[4909]: I1002 20:39:28.839536 4909 generic.go:334] "Generic (PLEG): container finished" podID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerID="341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e" exitCode=0 Oct 02 20:39:28 crc kubenswrapper[4909]: I1002 20:39:28.839669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-7xbqd/must-gather-d9bgq" event={"ID":"2f46ef23-b745-4df8-a41b-65046d7a873e","Type":"ContainerDied","Data":"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e"} Oct 02 20:39:28 crc kubenswrapper[4909]: I1002 20:39:28.841271 4909 scope.go:117] "RemoveContainer" containerID="341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e" Oct 02 20:39:29 crc kubenswrapper[4909]: I1002 20:39:29.541708 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xbqd_must-gather-d9bgq_2f46ef23-b745-4df8-a41b-65046d7a873e/gather/0.log" Oct 02 20:39:44 crc kubenswrapper[4909]: I1002 20:39:44.398627 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xbqd/must-gather-d9bgq"] Oct 02 20:39:44 crc kubenswrapper[4909]: I1002 20:39:44.399672 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="copy" containerID="cri-o://a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847" gracePeriod=2 Oct 02 20:39:44 crc kubenswrapper[4909]: I1002 20:39:44.420315 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xbqd/must-gather-d9bgq"] Oct 02 20:39:44 crc kubenswrapper[4909]: I1002 20:39:44.955149 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xbqd_must-gather-d9bgq_2f46ef23-b745-4df8-a41b-65046d7a873e/copy/0.log" Oct 02 20:39:44 crc kubenswrapper[4909]: I1002 20:39:44.955847 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.053703 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xbqd_must-gather-d9bgq_2f46ef23-b745-4df8-a41b-65046d7a873e/copy/0.log" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.054166 4909 generic.go:334] "Generic (PLEG): container finished" podID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerID="a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847" exitCode=143 Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.054235 4909 scope.go:117] "RemoveContainer" containerID="a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.054230 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xbqd/must-gather-d9bgq" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.054424 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dvk6\" (UniqueName: \"kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6\") pod \"2f46ef23-b745-4df8-a41b-65046d7a873e\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.054639 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output\") pod \"2f46ef23-b745-4df8-a41b-65046d7a873e\" (UID: \"2f46ef23-b745-4df8-a41b-65046d7a873e\") " Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.079464 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6" (OuterVolumeSpecName: "kube-api-access-9dvk6") pod "2f46ef23-b745-4df8-a41b-65046d7a873e" (UID: 
"2f46ef23-b745-4df8-a41b-65046d7a873e"). InnerVolumeSpecName "kube-api-access-9dvk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.096899 4909 scope.go:117] "RemoveContainer" containerID="341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.167762 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dvk6\" (UniqueName: \"kubernetes.io/projected/2f46ef23-b745-4df8-a41b-65046d7a873e-kube-api-access-9dvk6\") on node \"crc\" DevicePath \"\"" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.200376 4909 scope.go:117] "RemoveContainer" containerID="a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847" Oct 02 20:39:45 crc kubenswrapper[4909]: E1002 20:39:45.200790 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847\": container with ID starting with a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847 not found: ID does not exist" containerID="a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.200836 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847"} err="failed to get container status \"a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847\": rpc error: code = NotFound desc = could not find container \"a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847\": container with ID starting with a8c081aebe49e99317d7afebd5e5012c0f2847c25d60db466d75c5b741c5b847 not found: ID does not exist" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.200861 4909 scope.go:117] "RemoveContainer" 
containerID="341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e" Oct 02 20:39:45 crc kubenswrapper[4909]: E1002 20:39:45.201160 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e\": container with ID starting with 341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e not found: ID does not exist" containerID="341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.201185 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e"} err="failed to get container status \"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e\": rpc error: code = NotFound desc = could not find container \"341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e\": container with ID starting with 341ef53a9f6a4989956cf7b4ffb87c3497da0a3639b3c38b62d466cda0761d1e not found: ID does not exist" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.237547 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2f46ef23-b745-4df8-a41b-65046d7a873e" (UID: "2f46ef23-b745-4df8-a41b-65046d7a873e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.269504 4909 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2f46ef23-b745-4df8-a41b-65046d7a873e-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 20:39:45 crc kubenswrapper[4909]: I1002 20:39:45.622225 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" path="/var/lib/kubelet/pods/2f46ef23-b745-4df8-a41b-65046d7a873e/volumes" Oct 02 20:39:53 crc kubenswrapper[4909]: I1002 20:39:53.055233 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:39:53 crc kubenswrapper[4909]: I1002 20:39:53.056078 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.054791 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4777h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.055488 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.055557 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4777h" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.056532 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"} pod="openshift-machine-config-operator/machine-config-daemon-4777h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.056627 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" containerName="machine-config-daemon" containerID="cri-o://8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8" gracePeriod=600 Oct 02 20:40:23 crc kubenswrapper[4909]: E1002 20:40:23.211186 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.522544 4909 generic.go:334] "Generic (PLEG): container finished" podID="31958374-7b04-45be-9509-c51e08f9afe2" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8" exitCode=0 Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.522910 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4777h" event={"ID":"31958374-7b04-45be-9509-c51e08f9afe2","Type":"ContainerDied","Data":"8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"} Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.522951 4909 scope.go:117] "RemoveContainer" containerID="b60f7aa65d898dd00a1ccbd0a0ab4f4c381eb31dafcc4397ea86ae1732a7e36d" Oct 02 20:40:23 crc kubenswrapper[4909]: I1002 20:40:23.523802 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8" Oct 02 20:40:23 crc kubenswrapper[4909]: E1002 20:40:23.524167 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.935885 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"] Oct 02 20:40:30 crc kubenswrapper[4909]: E1002 20:40:30.937256 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="gather" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="gather" Oct 02 20:40:30 crc kubenswrapper[4909]: E1002 20:40:30.937319 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="registry-server" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937332 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" 
containerName="registry-server" Oct 02 20:40:30 crc kubenswrapper[4909]: E1002 20:40:30.937361 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="extract-utilities" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937379 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="extract-utilities" Oct 02 20:40:30 crc kubenswrapper[4909]: E1002 20:40:30.937411 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="copy" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937425 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="copy" Oct 02 20:40:30 crc kubenswrapper[4909]: E1002 20:40:30.937482 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="extract-content" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937494 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="extract-content" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937887 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="gather" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937910 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f46ef23-b745-4df8-a41b-65046d7a873e" containerName="copy" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.937935 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1d76a7-afa0-4bce-9294-59e2348c83ea" containerName="registry-server" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.941119 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:30 crc kubenswrapper[4909]: I1002 20:40:30.957861 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"] Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.034139 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfjb\" (UniqueName: \"kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.034320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.034517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.111221 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngzbj"] Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.113509 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.126916 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngzbj"] Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.137718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfjb\" (UniqueName: \"kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.137830 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.137931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.139074 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.139082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.161276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfjb\" (UniqueName: \"kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb\") pod \"redhat-marketplace-zxfrz\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") " pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.239885 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.240270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bhn\" (UniqueName: \"kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.240457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.302312 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zxfrz" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.342509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.342977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bhn\" (UniqueName: \"kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.343103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.343266 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.344179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " 
pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.364748 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bhn\" (UniqueName: \"kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn\") pod \"community-operators-ngzbj\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") " pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.443820 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngzbj" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.450401 4909 scope.go:117] "RemoveContainer" containerID="078ae825d31084f0fd5758fa7c65edce3e056e7ecadae749d408dedbd8dcf582" Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.809003 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"] Oct 02 20:40:31 crc kubenswrapper[4909]: I1002 20:40:31.993249 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngzbj"] Oct 02 20:40:31 crc kubenswrapper[4909]: W1002 20:40:31.997994 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c23e65_b99d_4e4c_a264_0e8993c77ffa.slice/crio-cb0dcda76a2f509fc60ac8330733470d61cc7f8d96642c86ee34444ac0feb965 WatchSource:0}: Error finding container cb0dcda76a2f509fc60ac8330733470d61cc7f8d96642c86ee34444ac0feb965: Status 404 returned error can't find the container with id cb0dcda76a2f509fc60ac8330733470d61cc7f8d96642c86ee34444ac0feb965 Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.690573 4909 generic.go:334] "Generic (PLEG): container finished" podID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" containerID="d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f" exitCode=0 Oct 02 20:40:32 crc 
kubenswrapper[4909]: I1002 20:40:32.690985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerDied","Data":"d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f"} Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.691018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerStarted","Data":"cb0dcda76a2f509fc60ac8330733470d61cc7f8d96642c86ee34444ac0feb965"} Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.718263 4909 generic.go:334] "Generic (PLEG): container finished" podID="2649c93e-7819-4899-a54e-75c72ce60ff9" containerID="cbf04ed4d397c5ab090d3ff29298f69910a355b3b7ffa074554a80b1957a1b5d" exitCode=0 Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.718303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerDied","Data":"cbf04ed4d397c5ab090d3ff29298f69910a355b3b7ffa074554a80b1957a1b5d"} Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.718329 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerStarted","Data":"30133cf55941248b806223421c15f539e70edda07b828f606c2cc378c1dfad36"} Oct 02 20:40:32 crc kubenswrapper[4909]: I1002 20:40:32.718517 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 20:40:34 crc kubenswrapper[4909]: I1002 20:40:34.748798 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" 
event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerStarted","Data":"45fb3dff74e9450f7759a8edb84d48f704de52001439205c66dccf838dd53db3"} Oct 02 20:40:34 crc kubenswrapper[4909]: I1002 20:40:34.752332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerStarted","Data":"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"} Oct 02 20:40:35 crc kubenswrapper[4909]: I1002 20:40:35.768832 4909 generic.go:334] "Generic (PLEG): container finished" podID="2649c93e-7819-4899-a54e-75c72ce60ff9" containerID="45fb3dff74e9450f7759a8edb84d48f704de52001439205c66dccf838dd53db3" exitCode=0 Oct 02 20:40:35 crc kubenswrapper[4909]: I1002 20:40:35.768901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerDied","Data":"45fb3dff74e9450f7759a8edb84d48f704de52001439205c66dccf838dd53db3"} Oct 02 20:40:37 crc kubenswrapper[4909]: I1002 20:40:37.609175 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8" Oct 02 20:40:37 crc kubenswrapper[4909]: E1002 20:40:37.610232 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2" Oct 02 20:40:37 crc kubenswrapper[4909]: I1002 20:40:37.795330 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" 
event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerStarted","Data":"e44e25079474f363caba8f656ec78c186c162e07c63417a39b324c5262c24419"}
Oct 02 20:40:37 crc kubenswrapper[4909]: I1002 20:40:37.803896 4909 generic.go:334] "Generic (PLEG): container finished" podID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" containerID="1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b" exitCode=0
Oct 02 20:40:37 crc kubenswrapper[4909]: I1002 20:40:37.803992 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerDied","Data":"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"}
Oct 02 20:40:37 crc kubenswrapper[4909]: I1002 20:40:37.840771 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zxfrz" podStartSLOduration=3.5665104960000003 podStartE2EDuration="7.840747783s" podCreationTimestamp="2025-10-02 20:40:30 +0000 UTC" firstStartedPulling="2025-10-02 20:40:32.732609605 +0000 UTC m=+8553.920105464" lastFinishedPulling="2025-10-02 20:40:37.006846852 +0000 UTC m=+8558.194342751" observedRunningTime="2025-10-02 20:40:37.837142882 +0000 UTC m=+8559.024638761" watchObservedRunningTime="2025-10-02 20:40:37.840747783 +0000 UTC m=+8559.028243652"
Oct 02 20:40:39 crc kubenswrapper[4909]: I1002 20:40:39.830250 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerStarted","Data":"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"}
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.302715 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.303085 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.393449 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.420018 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngzbj" podStartSLOduration=4.480852952 podStartE2EDuration="10.419994853s" podCreationTimestamp="2025-10-02 20:40:31 +0000 UTC" firstStartedPulling="2025-10-02 20:40:32.718264952 +0000 UTC m=+8553.905760811" lastFinishedPulling="2025-10-02 20:40:38.657406853 +0000 UTC m=+8559.844902712" observedRunningTime="2025-10-02 20:40:39.852928708 +0000 UTC m=+8561.040424587" watchObservedRunningTime="2025-10-02 20:40:41.419994853 +0000 UTC m=+8562.607490722"
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.444018 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:41 crc kubenswrapper[4909]: I1002 20:40:41.444088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:42 crc kubenswrapper[4909]: I1002 20:40:42.512410 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ngzbj" podUID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" containerName="registry-server" probeResult="failure" output=<
Oct 02 20:40:42 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s
Oct 02 20:40:42 crc kubenswrapper[4909]: >
Oct 02 20:40:51 crc kubenswrapper[4909]: I1002 20:40:51.372673 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:51 crc kubenswrapper[4909]: I1002 20:40:51.500939 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:51 crc kubenswrapper[4909]: I1002 20:40:51.575103 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:52 crc kubenswrapper[4909]: I1002 20:40:52.608793 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"
Oct 02 20:40:52 crc kubenswrapper[4909]: E1002 20:40:52.609382 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:40:53 crc kubenswrapper[4909]: I1002 20:40:53.707853 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"]
Oct 02 20:40:53 crc kubenswrapper[4909]: I1002 20:40:53.708141 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zxfrz" podUID="2649c93e-7819-4899-a54e-75c72ce60ff9" containerName="registry-server" containerID="cri-o://e44e25079474f363caba8f656ec78c186c162e07c63417a39b324c5262c24419" gracePeriod=2
Oct 02 20:40:53 crc kubenswrapper[4909]: I1002 20:40:53.912872 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngzbj"]
Oct 02 20:40:53 crc kubenswrapper[4909]: I1002 20:40:53.913228 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngzbj" podUID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" containerName="registry-server" containerID="cri-o://f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6" gracePeriod=2
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.005098 4909 generic.go:334] "Generic (PLEG): container finished" podID="2649c93e-7819-4899-a54e-75c72ce60ff9" containerID="e44e25079474f363caba8f656ec78c186c162e07c63417a39b324c5262c24419" exitCode=0
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.005154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerDied","Data":"e44e25079474f363caba8f656ec78c186c162e07c63417a39b324c5262c24419"}
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.368092 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.496715 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.540790 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content\") pod \"2649c93e-7819-4899-a54e-75c72ce60ff9\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.541036 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities\") pod \"2649c93e-7819-4899-a54e-75c72ce60ff9\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.541219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scfjb\" (UniqueName: \"kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb\") pod \"2649c93e-7819-4899-a54e-75c72ce60ff9\" (UID: \"2649c93e-7819-4899-a54e-75c72ce60ff9\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.547672 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb" (OuterVolumeSpecName: "kube-api-access-scfjb") pod "2649c93e-7819-4899-a54e-75c72ce60ff9" (UID: "2649c93e-7819-4899-a54e-75c72ce60ff9"). InnerVolumeSpecName "kube-api-access-scfjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.548274 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities" (OuterVolumeSpecName: "utilities") pod "2649c93e-7819-4899-a54e-75c72ce60ff9" (UID: "2649c93e-7819-4899-a54e-75c72ce60ff9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.556063 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2649c93e-7819-4899-a54e-75c72ce60ff9" (UID: "2649c93e-7819-4899-a54e-75c72ce60ff9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.643044 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5bhn\" (UniqueName: \"kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn\") pod \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.643133 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities\") pod \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.643235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content\") pod \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\" (UID: \"71c23e65-b99d-4e4c-a264-0e8993c77ffa\") "
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.643955 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.644512 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2649c93e-7819-4899-a54e-75c72ce60ff9-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.644574 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scfjb\" (UniqueName: \"kubernetes.io/projected/2649c93e-7819-4899-a54e-75c72ce60ff9-kube-api-access-scfjb\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.644513 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities" (OuterVolumeSpecName: "utilities") pod "71c23e65-b99d-4e4c-a264-0e8993c77ffa" (UID: "71c23e65-b99d-4e4c-a264-0e8993c77ffa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.650527 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn" (OuterVolumeSpecName: "kube-api-access-c5bhn") pod "71c23e65-b99d-4e4c-a264-0e8993c77ffa" (UID: "71c23e65-b99d-4e4c-a264-0e8993c77ffa"). InnerVolumeSpecName "kube-api-access-c5bhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.699478 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c23e65-b99d-4e4c-a264-0e8993c77ffa" (UID: "71c23e65-b99d-4e4c-a264-0e8993c77ffa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.747057 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5bhn\" (UniqueName: \"kubernetes.io/projected/71c23e65-b99d-4e4c-a264-0e8993c77ffa-kube-api-access-c5bhn\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.747106 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:54 crc kubenswrapper[4909]: I1002 20:40:54.747125 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c23e65-b99d-4e4c-a264-0e8993c77ffa-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.021523 4909 generic.go:334] "Generic (PLEG): container finished" podID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" containerID="f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6" exitCode=0
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.021650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerDied","Data":"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"}
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.021708 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngzbj" event={"ID":"71c23e65-b99d-4e4c-a264-0e8993c77ffa","Type":"ContainerDied","Data":"cb0dcda76a2f509fc60ac8330733470d61cc7f8d96642c86ee34444ac0feb965"}
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.021740 4909 scope.go:117] "RemoveContainer" containerID="f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.022054 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngzbj"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.040402 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zxfrz" event={"ID":"2649c93e-7819-4899-a54e-75c72ce60ff9","Type":"ContainerDied","Data":"30133cf55941248b806223421c15f539e70edda07b828f606c2cc378c1dfad36"}
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.040490 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zxfrz"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.070817 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngzbj"]
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.071241 4909 scope.go:117] "RemoveContainer" containerID="1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.084791 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngzbj"]
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.120874 4909 scope.go:117] "RemoveContainer" containerID="d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.126889 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"]
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.140441 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zxfrz"]
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.165156 4909 scope.go:117] "RemoveContainer" containerID="f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"
Oct 02 20:40:55 crc kubenswrapper[4909]: E1002 20:40:55.165714 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6\": container with ID starting with f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6 not found: ID does not exist" containerID="f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.165746 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6"} err="failed to get container status \"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6\": rpc error: code = NotFound desc = could not find container \"f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6\": container with ID starting with f13fff3b1691652c57e2bc8a05d88b8866c5cfbe542014c1279dcdd2be70f0f6 not found: ID does not exist"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.165766 4909 scope.go:117] "RemoveContainer" containerID="1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"
Oct 02 20:40:55 crc kubenswrapper[4909]: E1002 20:40:55.165944 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b\": container with ID starting with 1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b not found: ID does not exist" containerID="1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.165966 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b"} err="failed to get container status \"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b\": rpc error: code = NotFound desc = could not find container \"1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b\": container with ID starting with 1277c5439d48b483d3d59b0597906f60e501ed2d6c113f9f0f936f16b87a619b not found: ID does not exist"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.165978 4909 scope.go:117] "RemoveContainer" containerID="d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f"
Oct 02 20:40:55 crc kubenswrapper[4909]: E1002 20:40:55.169374 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f\": container with ID starting with d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f not found: ID does not exist" containerID="d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.169482 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f"} err="failed to get container status \"d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f\": rpc error: code = NotFound desc = could not find container \"d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f\": container with ID starting with d707e82f16b9aa7f1c340acb5a854588b882bc459d7c639c7271307f6eda106f not found: ID does not exist"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.169553 4909 scope.go:117] "RemoveContainer" containerID="e44e25079474f363caba8f656ec78c186c162e07c63417a39b324c5262c24419"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.215170 4909 scope.go:117] "RemoveContainer" containerID="45fb3dff74e9450f7759a8edb84d48f704de52001439205c66dccf838dd53db3"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.238693 4909 scope.go:117] "RemoveContainer" containerID="cbf04ed4d397c5ab090d3ff29298f69910a355b3b7ffa074554a80b1957a1b5d"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.633253 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2649c93e-7819-4899-a54e-75c72ce60ff9" path="/var/lib/kubelet/pods/2649c93e-7819-4899-a54e-75c72ce60ff9/volumes"
Oct 02 20:40:55 crc kubenswrapper[4909]: I1002 20:40:55.634395 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c23e65-b99d-4e4c-a264-0e8993c77ffa" path="/var/lib/kubelet/pods/71c23e65-b99d-4e4c-a264-0e8993c77ffa/volumes"
Oct 02 20:41:03 crc kubenswrapper[4909]: I1002 20:41:03.609499 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"
Oct 02 20:41:03 crc kubenswrapper[4909]: E1002 20:41:03.610304 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:41:18 crc kubenswrapper[4909]: I1002 20:41:18.608763 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"
Oct 02 20:41:18 crc kubenswrapper[4909]: E1002 20:41:18.609680 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"
Oct 02 20:41:33 crc kubenswrapper[4909]: I1002 20:41:33.609283 4909 scope.go:117] "RemoveContainer" containerID="8af59e2d51cee42580bd07c3af8c53ead5bc9ce3a3fcdb4221893e13538802d8"
Oct 02 20:41:33 crc kubenswrapper[4909]: E1002 20:41:33.610630 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4777h_openshift-machine-config-operator(31958374-7b04-45be-9509-c51e08f9afe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4777h" podUID="31958374-7b04-45be-9509-c51e08f9afe2"